Archive 27/02/2024.

Why Hacktivism will lead to Digital Sovereignty

john.grant

In this context, the term hacker does not refer to someone engaged in cybercrime; rather, it refers to those motivated by curiosity and learning. In this sense, hackers are active participants in shaping their digital platforms. Likewise, hackable systems are not digital systems that are inherently insecure; they are open systems that lend themselves to technical tinkering, modification, creative problem solving, adaptation and forking. Hackable also applies to open algorithms and to machine learning and deep learning models.

As automation and AI become ever more pervasive, the scale of jobs automated away will increase accordingly, and the need for lifelong learning to become the norm rather than the exception will become ever more apparent. Hackers will gain an advantage here for a number of reasons. First, pervasive computing, hyperconnectivity and the income-generating potential built into decentralised systems will change how knowledge flows: the diffusion effect and the bandwidth effect will increase the marginal utility of decentralised platforms per unit of time. Second, unlike users of today's generation of social media platforms, hackers will be stakeholders in the decentralised systems they help to build and transact in.
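The claim about rising marginal utility per unit of time can be illustrated with a toy model. Everything here is my own illustrative assumption, not part of the original argument: I assume a Metcalfe-style network effect (utility proportional to the square of the number of participants), under which each hour of participation becomes worth more as the platform diffuses.

```python
# Toy sketch: marginal utility of time spent on a diffusing network.
# The Metcalfe-style utility function and all numbers are hypothetical.

def network_utility(n_participants: int) -> float:
    """Metcalfe-style utility: proportional to n^2 (pairwise connections)."""
    return n_participants ** 2

def marginal_utility_per_hour(n_now: int, growth_per_hour: int) -> float:
    """Extra utility gained during one hour of network diffusion."""
    return network_utility(n_now + growth_per_hour) - network_utility(n_now)

# The same hour is worth far more once the network has diffused:
early = marginal_utility_per_hour(1_000, 10)    # small network
late = marginal_utility_per_hour(100_000, 10)   # larger network
print(early, late)
assert late > early
```

Under this (assumed) convex utility curve, early participants capture compounding returns on their time, which is one way to read the diffusion and bandwidth effects named above.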

In contrast to users of today’s generation of social media platforms, hackers will expend greater levels of energy developing their skill sets and digital assets, and that energy will be used more productively: learning, collaborating, building, maintaining, and earning an income, most likely from multiple revenue streams. It is difficult to predict how trustless economies and dynamic governance will evolve, but Numerai may offer one example of what to expect: on the Numerai platform, data scientists are paid in bitcoin to build an open hedge fund by modelling the stock market. Another example is Handshake, a decentralised DNS and certificate authority incorporating economic incentives.
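The incentive loop behind a Numerai-style tournament can be sketched in a few lines. The scoring and payout rules below are hypothetical simplifications of the general idea (stake on your model, earn in proportion to skill), not Numerai's actual mechanism or API:

```python
# Hypothetical sketch of a staked prediction tournament.
# Names, scoring rule and payout formula are invented for illustration.
import random

def score(predictions, outcomes):
    """Toy scoring rule: fraction of market directions called correctly."""
    hits = sum(p == o for p, o in zip(predictions, outcomes))
    return hits / len(outcomes)

def payout(stake, skill_score, baseline=0.5):
    """Staked participants earn (or lose) in proportion to skill above chance."""
    return stake * (skill_score - baseline)

random.seed(0)
outcomes = [random.choice([+1, -1]) for _ in range(1000)]
guesses = [random.choice([+1, -1]) for _ in range(1000)]

# A coin-flip model scores near 0.5, so its expected payout is near zero;
# only genuinely predictive models earn a positive return on their stake.
print(round(payout(stake=100.0, skill_score=score(guesses, outcomes)), 2))
```

The point of such a design is that participants are stakeholders rather than users: their income depends on the quality of what they contribute, which is the contrast with ad-funded social platforms drawn above.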

To adapt to constant technological change, it is inevitable that participatory inquiry and active learning will evolve. It will probably take several decades before efficient and resilient decentralised platforms emerge, but when they do, they will become formidable competitors to closed platforms and service providers. In the meantime, the scarcity of attention means that social media technologies will continue to test the boundaries of reward-pathway stimulation.

How can a transition to greater digital sovereignty be accelerated? Force social networking services to seek out alternative business models by repealing Section 230.

The issue is similar to the ongoing situation between TikTok and US authorities. If technological and digital sovereignty applies to the nation state, then it should apply to individual citizens too.

jesper

Hi John

So a summary would be: “social media technologies will continue to test the boundaries of reward pathway stimulation” in the commodity space on the right of the map - but building on “automation and AI become ever more pervasive” this enables and empowers hacktivism and trends towards more open platforms?

Given that tools like Numerai and Handshake are present, I would grade the field as emerging rather than novel/concept. But that’s just position; where it’s heading is more interesting… Are there any sources of inertia or blockers? One could be the sunk-cost fallacy in the existing closed platforms…

A subtrend of hacktivism is the “personalization” of medical devices. Tools and APIs exist to allow people to hack their own heart (see below), making them hackers rather than users, challenging the status quo and driving innovation from the left.

thanks again

john.grant

Thank you for your comments @jesper

I would summarise it differently. As automation and AI become ever more pervasive, humans will move up the stack. To adapt, the way we learn, such as participatory inquiry and active learning, will evolve.

Numerai is an emerging community/economy. Handshake is experimental.

Alternative business models and incentives.

As an enabler of digital sovereignty, I would draw a distinction between consuming APIs on closed platforms and developing decentralised applications on open platforms.

julian.everett

Apologies for coming to this discussion late. I have been thinking a lot about this topic lately. Can you explain a bit more about the drivers for decentralisation as you see them, please? It seems to me that the core differentiating raw material of centralised platforms is their vast stores of user data, which allow them to mine at scale for high-value behavioural surplus, to use Shoshana Zuboff’s phrase. Unless those data stores become corrupted (as per the TikTok Trump rally hack) or are legislated against, won’t there be a fundamental selective advantage in centralisation until the point where machine learning is sophisticated enough to no longer rely on brute-force training against large datasets? Until then I think there is a strong risk of platforms like Numerai just becoming a data-scientist Mechanical Turk as hacking skills themselves become commoditised (even if Numerai has good intentions; I already notice the one thing they don’t allow access to is actual market data :expressionless: ). Obviously I massively hope I’m wrong on this…

john.grant

Thank you for your comments @julian.everett. To attempt to answer your question about the drivers for decentralisation, I will put the business model of the current social media platforms aside and focus instead on the broader context of my original post: the urgent need to normalise lifelong learning.

Let’s take Twitter. Its design cannot provide the bandwidth for learning. The platform, I would argue, is biased towards broadcasting rather than listening. Many participants resort to conspicuously asserting their good character or moral correctness. At this point, many others simply disengage and learning stops.

I am convinced a vast and diverse range of online communities will cooperate and play an important role in helping to normalise lifelong learning. It is going to be interesting to see how these communities address the challenges of learning in the open: for example, communities that create artefacts (open-source software) versus communities that mainly share experience and opinion.

With respect to the latter, the Wardley map below illustrates how topic hacking may evolve in the knowledge economy to support lifelong learning. Topic hacking borrows from hacker culture and open source and is about the creation and sharing of living documents.

julian.everett

Thanks very much for the clarifications @john.grant - definitely agree with your point about the need to normalise lifelong learning.

It seems to me that open learning has a dependency on open knowledge representation, which in turn depends on established open knowledge verification tools (to combat disinformation), and also open knowledge validation to enable the emergence of new knowledge. The notion that “anyone can say anything about anything” was clearly one of TBL’s design goals, but seeing this in practice, it has obviously posed a very major threat to the collective commons of human knowledge.

I have tried to briefly map a bit of this out below. What it really highlighted to me was that a healthy knowledge ecosystem needs its validation and verification technologies to sit to the right-hand side of everything that depends on them, and we clearly have the reverse situation right now. It is also worrying that open science doesn’t seem to be paying this enough attention currently - a vulnerability which has obviously hit journalism super-hard with the fake news black hole.

john.grant

The descent of Google’s once-great UX is described well by Elaine Scattermoon: “It’s been a trip seeing Google go from guessing at what you could possibly mean, to showing you what you meant, to showing you what marketers wish you’d meant instead.”

What is possible, though, is that we will see a change someday, and maybe sooner rather than later. At some point, things will turn for the better, when people decide to become citizens rather than consumers; to be activists and not just users; and to commit to collective action, rather than trying to “change the world” on their own with a click or a swipe.

Public libraries are better than Google: Google is a pay-to-play scheme, not a civic authority.

john.grant

The much-debated Section 230 of the Communications Decency Act — one of the few parts of the law to survive court challenges — currently shields web companies from accountability for the material they allow to be posted online. It places private interest above public interest. The laissez-faire approach may have made sense in the web’s early days, when the future shape of digital broadcasting was impossible to foresee, but it’s counterproductive today.

Social media companies, and those who produce content on their platforms, should operate not under a liability shield that isolates them from the public interest but under a set of rules that makes them responsive to the public interest. It’s worth remembering that Congress’s decision to license radio operators after the Titanic disaster was about more than just allocating scarce spectrum. It was about bringing those who speak to the masses out of the shadows and into the daylight of the public square. It was about making broadcasters, whether individuals, businesses, or other organizations, visible and accountable. We may not need to establish a formal licensing program for social media, but we do need to bring the spirit of the common good back to broadcasting.

An ambitious Digital Communications Act along the lines sketched out here would be complicated and controversial. It would be resisted by many powerful private interests. It would not be a panacea. But we can no longer pretend that social media will fix itself. What we face today is a societal challenge, not a technological one. Without judicious regulation, the current problems will only get worse.

How to Fix Social Media - Nicholas Carr

chris.daniel

In my eyes, social media show the most important challenge of digital societies - the ongoing conflict between the societal desire to censor harmful content and the societal fear that individuals will use the same mechanisms to serve their own interests.

Any attempt to regulate social media companies requires defining what the public interest is, and that is the part that is easily gamed, unless the state is particularly strong.