What I’m working on in 2018
Concerns about the internet have hit the mainstream. People are talking about the downsides of social media, election manipulation, algorithmic bias, privacy, the power of dominant platforms like Google and Facebook, cybersecurity, and what AI is going to do to the world. Most of these issues are not new; folks have been working on them for many years. There are a lot of different, intersecting questions here. As an engineer with a background in emerging internet technologies, I find it frustrating how hard it is to find good projects to work on: ones that are useful, stand a chance of scaling, and avoid harm as far as possible. As I’m something of a generalist who likes building things and tackling problems, I’m working on a few of what seem to be the underlying issues, to see if we can find better ways to build things in the future. This is why I have a somewhat diverse portfolio at present. This post is a rough attempt to draw all the threads together and explain why I’m doing what I’m doing…
At Doteveryone in 2017 I established the Responsible Technology programme, trying to find practical ways to change tech industry behaviour. We didn’t take forward the idea of a trustmark, for various reasons; instead Doteveryone are developing a toolkit to help startups and SMEs work through how they might be more responsible as they develop internet products and services. Through the Trustworthy Tech Partners project, we prototyped parts of a modern trustmark system with a group of startups, and found that even amongst those seeking to do the right thing, with strong ethical motivations, it was helpful to provide a structure and to help the businesses think more holistically about good practice, going beyond their main focus areas (such as privacy, or fair procurement). Of course, this requires a business to want to be responsible in the first place.
The growing “tech ethics” movement seems to be raising interest in ethical practice amongst tech workers; there have been a few recent cases where employees raised concerns and influenced a business change, and there is excitement about ideas like ethical pledges. Such ideals are perhaps easier in theory than in practice. We’ll see whether that scales up; on one hand, I’m optimistic, as millennial workers are evidently keener on meaningful work, but equally there are still many people (across generations) who are more interested in money and power than in being ethical. Some of them will probably overtly talk a good line about ethics, too, whilst it is fashionable to do so.
Some of us have done our best to be responsible in our work for a long time. It’s not always easy. You need to go and talk with folks from different backgrounds and sectors to understand whether you are actually doing the best thing (and to get your ideas reviewed to see if they are reasonable at all — whether in a formal ethics review, or something less structured). Proper engineering can take more time, if you think through corner cases and different future scenarios to design a system which is suitably secure and resilient, and works well for real people. In the consumer internet of things, it feels like a lot of modern products do very little in this regard. The sort of engineering we did at AlertMe — over a decade ago — would be regarded as overengineering by the standards of a lot of recent ventures. We can, and should, do better. The work of great groups like I am the Cavalry, fighting for more secure connected devices and systems in medicine, vehicles, and industry, ought to be a niche concern; instead it is desperately required almost across the board.
We can work to get good engineering practices required by formal or informal standards, demanded by customers, and recognised by businesses (and investors) as basic requirements, rather than corners that can be cut to save time and money.
What more structural changes do we need?
Many internet concerns relate to personal data online. We’re all waiting to see what sort of difference GDPR makes here: to the big companies, to the data brokers we often forget, and to small businesses, some of which may be tomorrow’s giants. New models to give individuals more control over their data, such as HAT and Databox, start to hint at very different data futures. We need to be building these things, trying such ideas out, and finding different business models that can move us past the attention economy and surveillance capitalism. They won’t all work, but we cannot simply wait around to see what GDPR enforcement looks like or achieves. I’m looking forward to MyData in August to see what else is happening in this area.
Getting new ideas moving, and experimenting with new concepts of how connected systems can be built, deployed, operated and governed, is something I hope we’ll be doing more of at Cambridge, through the new initiative I’m getting off the ground. The Trust & Technology Initiative brings together and drives forward interdisciplinary research from Cambridge and beyond to explore the dynamics of trust and distrust in relation to internet technologies, society and power; to better inform trustworthy design and governance of next generation tech at the research and development stage; and to promote informed, critical, and engaging voices supporting individuals, communities and institutions in light of technology’s increasing pervasiveness in societies. We’re launching in September. It’s been a fascinating few months so far, uncovering a great deal of superb research around trust, power and tech, and spotting opportunities for new projects. Improving the internet — making it more resilient, better governed and better for people — needs consideration of politics as well as technology. We need to design with the knowledge that other people have very different intents and motivations, both within our own communities and culture, and at a larger scale around the world.
Maybe we will move to a more distributed future. This could mean less centralisation down at the network routing layer, or moving from big data and compute centres to more edge processing. Most discussions about internet ethics don’t consider these layers, but they underlie everything else, and affect real people through censorship and control of access to the internet. The Public Stack Summit was a great reminder of this, and it was encouraging to see people working on these issues.
Of course, we can also wait for whatever follows capitalism, but that seems a rather long-term play. I’d like to think we could do better in the here and now, although that might mean moving away from traditional tech funding and delivery structures for consumer internet services. (These are particularly prone to network effects and centralisation — which would perhaps be OK if they were better governed, and didn’t further drive wealth inequality by creating value for a very few elites.) The standard venture-capital-backed startup doesn’t, in the main, deliver good internet services, as the pressure to grow and exit is in tension with good practices; the model is perhaps more workable in B2B settings. Social enterprises can be useful for specific niche applications, but don’t seem viable for more infrastructural systems. The Zebra movement is encouraging tech businesses to pursue funding and resourcing routes that don’t demand growth at all costs. I think there’s still a missing piece around who benefits from the value created, though.
I’m intrigued by the potential of co-ops in this regard. Platform co-operativism is a growing movement and shows promise in some areas; worker co-operatives in tech are connecting and growing in the UK. I think that co-ops may be a better solution for delivering consumer internet services, too, which is why I co-founded diglife.coop last year. The Digital Life Collective is a grand experiment in whether a mass member-owned and member-run co-op can deliver better everyday internet services to people than Silicon Valley does — preserving our privacy, with accessible, reliable, equitable tech. With lots of members, each contributing even modest amounts, the co-op could have significant resources to provide open source tech that is well designed, convenient and supported, meeting everyday needs. And there’s a good chance we can set it up so that it’s better governed than the tech coming from the giants of the Valley (or China). (Progress on the Collective has been a lot slower than I’d like, but I still believe this is an essential experiment to try — in this incarnation or another.)
Through the Digital Life members, I’ve learnt about other promising projects which seek to build resilient, distributed, connected futures, with better data and governance practices. It’s especially good to see brave endeavours which recognise the need for ecosystem change — just creating a single project or open source tool is not going to be enough. Holochain is creating a platform for scalable distributed applications, including hardware bits and incentive structures; their thinking around keeping data where it needs to be is really compelling. (If only I could feel more confident that their underlying maths was really sound — I’m still looking for an independent review…) RChain is also interesting — with a community co-op as well as a separate investment vehicle — and a goal of being a more useful distributed compute platform which can handle lots of transactions, rapidly, without ridiculous energy burn. Designing governance — at least to some extent — at the same time as thinking about radical new technology architectures seems a useful way forward. Governance for very decentralised systems could look quite different from how we think about organisational governance today. Then there’s hashgraph, which is somewhat similar; and specific application areas like Faircoin (which is supposed to be stable, incorruptible and eco-friendly), and filesystems for the next generation web, like IPFS…
If only these projects were easier to grasp (or evaluate) for folks not already embedded in the weird wider world of blockchain! (And this is outside the fevered world of coins and ICOs, get-rich-quick schemes and dodgy code…) Even tech folks from other bits of the internet aren’t likely to come in and demand to know up front whether or not a system is Turing complete and Byzantine fault tolerant. Hopefully these projects will get more accessible with time, and those of us who can’t entirely grasp the maths in the inevitable white papers will be able to find trustworthy reviews from independent experts. It feels like many of the people campaigning for better internet practices are unaware of these new concepts and projects coming down the tracks. I’m excited to see how these different projects present themselves and form alliances at Open:2018.
I think there’s potential for more fundamental shifts in how the internet works. At the Public Stack Summit, we talked about what our ideal internet would look like (with Euro-centric values, a strong sense of both privacy and personal control, and of common goods) — and the potential limits of this utopian dream. The idea of a new role for public and common goods at some places in the internet stack feels like a good one, though. Having open, common components of data or code, or open standards enforced across carefully chosen layers of the stack, might be a promising way to drive change. It’s going to take a lot of work, though, as the internet and all its associated services and systems are highly complex and diverse. There are a lot of moving parts, whether you look at the technical side (code, data, who is actually running services and where, who is making hardware and where and how), or the governance side (regulations, standards, checks and balances, at local, national and online levels), or the diversity of intersecting applications (identity, money, algorithms/AI, content, safety-critical systems, resilient infrastructure, usable everyday applications, security tools).
Different facets of this are starting to be discussed and explored. “Data trusts” are proposed by the UK government’s AI report — whatever exactly these turn out to be, they will be a legal construct, not a technical one, seeking to make sure the right people have the right access to data at the right time. AI is an interesting case here, where data and machine learning systems interconnect and where we ought to think in advance about where power and value creation occur. My former Open Knowledge colleague Rufus Pollock proposes a fundamental shift in intellectual property, and remuneration rights, to change the landscape of control and ownership. As well as changes on the data/information side, I think there’s scope for a new generation of powerful and effective open standards at certain layers of the stack, for certain applications. That is not an easy architectural challenge, but it could change the dynamics of the systems involved in a potentially useful way. We’ll need economists, political and social scientists as well as technologists to design these, and to think with an adversarial mindset, not just an optimistic one.
In a landscape of many potential interventions, I’m most interested in the ones which bring together people and technologies, testing and building useful systems for tomorrow. I’m less interested in changing tech culture, developing regulation, or creating design patterns to help with today’s issues — other people are better suited to that work.
We need new business models, new ways to create technology whose governance delivers for society as well as for individuals, and new ways to build resilient systems. We should be researching and experimenting with these things, to create a better environment for tomorrow’s cutting-edge technologies. I think of myself as a builder, working with deeply technical teams and connecting them to real human and practical needs. I’d like to be working on emerging technologies again, once I find an opportunity with the sweet spot of useful application, a plausible and responsible business model and financing, and fair pay and conditions. But right now I’ll be a connector, catalyst and translator a little longer, researching issues of trust, power and technology, experimenting with co-ops, learning about next generation distributed systems, and helping shift today’s tech industry where possible. (Not to mention my side projects — raising the profile of looking after the things that matter, especially common goods and infrastructure, through the Festival of Maintenance; and early-stage work on what might become something not entirely unlike an accelerator, to support those working on new ventures tackling meaningful challenges at scale.)