With the advance of cloud computing and the aggressive invasion of “social services” like Facebook, MySpace, Google and the rest, it looks like there is no space left for a person’s private data (Naomi Klein’s “No Space” comes to mind). As soon as information is fed to Facebook, Google or any other such entity, it stops being the property of that person and becomes the property of the company. Another thing that is happening is the annihilation of local services and local communities and the removal of local knowledge (in Egypt, it seems, reversing that trend helped the revolution). At present, to know the community right at your doorstep you have to go to Facebook, Twitter, MySpace, etc. and explore it there. It’s not hard to imagine Facebook’s services disappearing one day, either entirely (the South Park way) or partially (the Facebook way). That could have a very measurable negative impact on a community hooked on such services. The scenario can be replayed for different “cloud providers” and different “communities”. In other words, people are in great danger of losing not only their personal data but also their collective, community data. Imagine losing all the books of Dickens overnight (or of Orwell), or any other cultural heritage that belongs not to a single individual but to entire nations or even the entire planet.
There’s a solution: the most antagonized creation of the IT industry, BitTorrent. Content publishers of all kinds (MPAA, RIAA, BlahAA, etc.) are after BitTorrent users; ISPs are after BitTorrent itself, throttling it down to a trickle; software manufacturers are, for the most part, scared out of their minds; and the media is busy demonizing BitTorrent users. These are all entities that want to own a person’s data but don’t want to give much back: Blu-Ray wants to know all about a person’s movies and lock him out if it doesn’t like something; ISPs want to know what a person is doing online and sell him or her out to the highest bidder; software producers want to know the consumer’s every move and turn it into a commodity or force-feed him advertisements. The common thread is to strip consumers of their privacy and their rights, and to turn them into a commodity.
As Google says in their own words, to their investors:
Who are our customers? Our customers are over one million advertisers, from small businesses targeting local customers to many of the world’s largest global enterprises, who use Google AdWords to reach millions of users around the world.
And as Mathew Ingram sums up in his article:
As the saying goes: If you’re not paying for it, then you’re the product being sold.
Linking all of the above with a brilliant presentation by Mark Pesce, some things come to mind:
Peer2Peer distribution + Localization + IPv6 = Freedom
The above needs some explanation and some technical skill to grok. The equation is actually much more complicated than it looks, and here’s what it translates to (or was born from).
Following Mark Pesce’s logic, the more popular a resource is, the more available it is. Note also that the resource does not exist in any single location; instead it exists on dozens of computers all at the same time. Such distribution is a bonus for any sort of freedom movement (WikiLeaks, anyone?) as it removes the single point of entry (ISP, domain registrar, government, etc.) that can be sued or scared into dropping the content. Just as Mark argues (and as everybody has known for a while now), once content is published online it starts a life of its own and can’t be contained. In a Peer2Peer scenario the survival rate is even higher.
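The effect of replication on survival can even be put in numbers. A minimal sketch (the independence assumption and the `availability` helper are mine, not taken from any BitTorrent spec): if each peer holding a copy is online with probability p, then with n replicas the chance that at least one copy is reachable is 1 - (1-p)^n.

```python
def availability(peer_online_prob: float, replicas: int) -> float:
    """Chance that at least one of `replicas` peers holding a copy is
    online, assuming peers go on- and offline independently."""
    return 1.0 - (1.0 - peer_online_prob) ** replicas
```

Even if each peer is online only 20% of the time, twenty replicas push availability above 98%. That is the survival-rate argument in numbers: popularity creates replicas, and replicas make content practically impossible to take down.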
Private peer-to-peer networking seems to be developing too: N2N, RetroShare, etc. This brings us one step closer to implementation.
Back to our equation: localization is needed to retain community information within the community (where it is most appreciated and valued) while making it available to everybody outside at speeds proportionate to demand. In other words, if your town has a pile of resources it wants to share primarily locally, and somebody outside the community is interested as well, the law of latencies helps here. Currently ISPs are the gatekeepers, so if there’s no ISP in town, there’s no data sharing for you. In other words, tech-savvy communities are hostages of ISPs. The alternative is a local mesh network that doesn’t need an ISP. All the “spare parts” are readily available: WiFi-equipped devices are on every corner, so turning them all into access points could create a local “roaming zone”. With Peer2Peer-based content distribution (think HTTP-over-BitTorrent), a community can host its own sites, forums, mailing lists, you name it, without ever needing a provider. It’s even possible to use different carriers: HTTP-over-SMS, old-school dial-up, even pulling an Ethernet cable across the driveway to your neighbour’s house, Bluetooth, infrared, etc.
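The “HTTP-over-BitTorrent” idea boils down to naming content by what it is rather than where it lives, and asking the mesh before asking any provider. A minimal, hypothetical sketch (the `content_id` scheme and the `fetch` helper are illustrative, not an existing protocol):

```python
import hashlib

def content_id(url: str) -> str:
    """Name content by a hash of its URL (illustrative scheme only)."""
    return hashlib.sha256(url.encode("utf-8")).hexdigest()

def fetch(url, local_peers, fallback_fetch):
    """Ask nearby mesh peers for the content first; only fall back to
    the wider network (and its gatekeepers) when nobody local has it."""
    cid = content_id(url)
    for peer in local_peers:       # zero-ISP path: the community mesh
        data = peer.get(cid)
        if data is not None:
            return data
    return fallback_fetch(url)     # last resort: leave the community
```

Here `local_peers` can be anything with a dict-like `get`, reachable over any carrier at all. A real system would hash the content itself rather than the URL, so integrity could be verified against the identifier.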
Localization is good, but inter-community communications are still needed. Now is the time to invoke FidoNet, an asynchronous distributed network of semi-autonomous nodes: a brilliant idea that was both right for its time and too advanced for its time. Take a close look at its node organization and it is exactly what is described above, except that it required phone lines. That is where IPv6 comes into play. FidoNet had a node list and network addresses assigned by a central authority, but essentially addresses were unlimited, just like IPv6. If we take IPv6 as the transport layer, we’ve almost resolved the problem of compatible addresses across the globe: every single machine can have a unique address, and routing can be done based on that. Now the idea doesn’t seem so crazy and distant, does it?
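IPv6 even has a carve-out suited to exactly this kind of community addressing: RFC 4193 Unique Local Addresses, where each site picks a random 40-bit Global ID under fd00::/8, with negligible odds of two sites colliding. A quick sketch of generating such a prefix (the `make_ula_prefix` helper is mine):

```python
import ipaddress
import os

def make_ula_prefix() -> ipaddress.IPv6Network:
    """Generate a random /48 Unique Local Address prefix (RFC 4193):
    the byte 0xFD, then a 40-bit random Global ID, then zeros."""
    global_id = os.urandom(5)                       # 40 random bits
    prefix = bytes([0xFD]) + global_id + bytes(10)  # pad to 128 bits
    return ipaddress.IPv6Network((int.from_bytes(prefix, "big"), 48))
```

Each community picks its prefix once, with no central authority involved; with 2^40 possible Global IDs a clash is practically impossible, so meshes can later interconnect without renumbering, which is precisely what FidoNet’s central node list could not offer.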
A couple more details to make it more attractive and add more meat to it: since we’ve got mesh networking, the IPv6 protocol and BitTorrent-like distribution of content, we have freed ourselves from any hard dependency on a specific physical transport medium. Whether it’s a phone line from my house to my neighbour’s, shared WiFi, point-to-point radio antennas, ham radio or pigeon mail: when somebody locally requests a pageX that is not part of the local community’s infrastructure, its download is scheduled throughout the community network of nodes, and at the first opportunity it is downloaded to the computer of whoever requested it. Now pageX is local. The next person asking for pageX will get it locally! The more popular the page, the more people will store it locally, so, as per Mark Pesce, the download speed goes up. A-ha! With clever mechanisms of caching and expiry it’s not so hard to devise a fairly efficient method of keeping things that are of interest to the population readily available (and not controlled by anybody).
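The caching-and-expiry mechanism can be as simple as a TTL cache where every hit extends an item’s lifetime, so popular pages linger locally and unpopular ones fall out on their own. A toy sketch (the policy and the `CommunityCache` class are hypothetical):

```python
import time

class CommunityCache:
    """Cache where each hit extends an item's lifetime: popular pages
    stay local longer, unpopular ones expire (hypothetical policy)."""

    def __init__(self, base_ttl=3600.0, bonus_per_hit=600.0):
        self.base_ttl = base_ttl
        self.bonus = bonus_per_hit
        self.items = {}  # content id -> (data, expires_at)

    def put(self, cid, data, now=None):
        now = time.time() if now is None else now
        self.items[cid] = (data, now + self.base_ttl)

    def get(self, cid, now=None):
        now = time.time() if now is None else now
        entry = self.items.get(cid)
        if entry is None:
            return None
        data, expires = entry
        if now >= expires:           # nobody asked for it in time
            del self.items[cid]
            return None
        # a hit is a vote of interest: push expiry further out
        self.items[cid] = (data, expires + self.bonus)
        return data
```

No administrator decides what stays: the community’s own request pattern is the retention policy.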
Now the next aspect of this theme is permanent local storage. While in the above scenario people keep downloading and storing other people’s stuff locally, it’s important for that “other people’s stuff” to exist in the first place. All that needs to be done is to define a “local storage” on all the nodes, whose content, just like with BitTorrent and other Peer2Peer networks, is shared freely upon request with the rest of the world but permanently resides on the local computer (unlike cached content that a person requested today or yesterday, which can expire tomorrow). In that case the user’s machine becomes the “host” for the content, but if the content becomes popular, the burden of serving it is shifted to the… wait for it… wait for it… cloud!
The above resolves the problem of content ownership and content persistence. If I like what I downloaded, I move it to my local storage, making it something that I host permanently; now there are two hosts hosting the same content (with the same signature) on the Peer2Peer network. It looks like having three different types of storage should cover the majority of use cases: a private store, a public store and a cache store. The private store holds data you do *not* intend to share with anybody (personal documents, pictures, etc.); the public store holds [personal] information intended for sharing: movies, sites, files, music, documents, etc.; and the cache stores only transient data, i.e. data that a person downloaded for whatever reason and is keeping for the time being to speed up subsequent access (and this is the only part controlled by automatic expiry measures).
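The three-tier split is easy to pin down as a data structure: only the public and cache tiers ever answer a peer’s request, and “I like it, I’ll host it” is just a promotion from cache to public. A sketch (the `NodeStore` class and its method names are mine):

```python
PRIVATE, PUBLIC, CACHE = "private", "public", "cache"

class NodeStore:
    """Three-tier local store: private data is never served, public
    data is served and pinned, cached data is served but may expire."""

    def __init__(self):
        self.tiers = {PRIVATE: {}, PUBLIC: {}, CACHE: {}}

    def put(self, tier, cid, data):
        self.tiers[tier][cid] = data

    def pin(self, cid):
        """Promote a cached item to the public store: the user decides
        to host it permanently (raises KeyError if it isn't cached)."""
        self.tiers[PUBLIC][cid] = self.tiers[CACHE].pop(cid)

    def serve(self, cid):
        """Answer a peer's request; private data is simply invisible."""
        for tier in (PUBLIC, CACHE):
            if cid in self.tiers[tier]:
                return self.tiers[tier][cid]
        return None
```

Every `pin` adds one more permanent host with the same content signature, which is exactly how the serving burden drifts into the cloud of peers.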
The above may sound far-fetched, but something is already happening in this domain: the FreedomBox Foundation has just started its operations, and if you look at its goals, they are already thinking in that direction:
We’re building software for smart devices whose engineered purpose is to work together to facilitate free communication among people, safely and securely, beyond the ambition of the strongest power to penetrate, they can make freedom of thought and information a permanent, ineradicable feature of the net that holds our souls.
Currently it looks like they are at the point where they target only the communication itself, not data preservation, but why wouldn’t that be a next step?
To get around ISPs getting overly sneaky and curious, a layer of Tor could be implemented between inter-community nodes or even throughout the community.
Imagine the applications for sharing information. Assume person A lives in community X. Now A goes on a trip to community Y and, of course, brings his laptop (?) with him. While at the bus station, everybody in close proximity gets to “know” what A knows and can share content with him (if they choose to), anonymously, at great speed, and without paying fees to any carrier.
The last piece missing in all of the above is an out-of-the-box hardware/software platform to support it. FreedomBox doesn’t seem to have goals that reach this far, and we won’t witness any great movements from Google, Microsoft, Apple or any other existing commercial entity that is not deeply rooted in the open-source world. All the proprietary vendors gear their operations towards other corporate/commercial entities rather than the average person (as was mentioned and proven earlier). It is not in their interest: without our data they have nothing to sell.