I’ve been invited to give a “very provocative talk” on what humanitarian response will look like in 2025 for the annual Global Policy Forum organized by the UN Office for the Coordination of Humanitarian Affairs (OCHA) in New York. I first explored this question in early 2012 and my colleague Andrej Verity recently wrote up this intriguing piece on the topic, which I highly recommend; intriguing because he focuses a lot on the future of the pre-deployment process, which is often overlooked.
I’m headed to the Philippines this week to collaborate with the UN Office for the Coordination of Humanitarian Affairs (OCHA) on humanitarian crowdsourcing and technology projects. I’ll be based in the OCHA Offices in Manila, working directly with colleagues Andrej Verity and Luis Hernando to support their efforts in response to Typhoon Yolanda. One project I’m exploring in this respect is a novel radio-SMS-computing initiative that my colleague Anahi Ayala (Internews) and I began drafting during ICCM 2013 in Nairobi last week. I’m sharing the approach here to solicit feedback before I land in Manila.
The “Radio + SMS + Computing” project is firmly grounded in GSMA’s official Code of Conduct for the use of SMS in Disaster Response. I have also drawn on the Bellagio Big Data Principles when writing up the ins and outs of this initiative with Anahi. The project is first and foremost a radio-based initiative that seeks to answer the information needs of disaster-affected communities.
China is having a light-bulb moment. Scientists from the Shanghai Institute of Technical Physics have discovered that a one-watt LED bulb embedded with a microchip can emit a Wi-Fi signal strong enough to provide Internet access for four computers.
The discovery, aptly named “Li-Fi,” relies on special LED light bulbs that operate with light as the carrier instead of traditional radio frequencies.
Data rates as fast as 150 megabits per second were achieved with the new Li-Fi connection, making it faster, cheaper and more energy efficient than traditional Wi-Fi signals.
Li-Fi apparently uses only five percent of the energy required to power Wi-Fi-emitting infrastructure, which relies on energy-hungry cooling systems at the cell towers and Wi-Fi stations that supply Internet access.
Though the discovery has huge potential in the way we use Internet connection, Li-Fi is still in a crude testing stage, since it doesn’t work if the light bulb is turned off or if light bulbs are blocked. That doesn’t seem like such a huge burden, though: it just means you’ll have to leave your lights on if you want to surf the Web. No more online shopping binges in the dark!
Li-Fi demonstrations will take place on November 5 in Shanghai at the International Industry Fair, where 10 kits will be tested out. A bright future seems to be in store for Li-Fi, with potential applications ranging from transmitting data via car headlights to focused-light links, among many others.
People everywhere have been organizing a more ethical economy, but they work in relative isolation, fragmented by geography, sector, and even organizational form.
Many organizations collect information about a small piece of these efforts. In every situation, another organization holds information that overlaps with it. In every case there is an opportunity to share that would strengthen all the organizations participating.
Sharing requires effort, it requires trust, and it requires infrastructure. The Data Commons is a cooperative of organizations that are sharing – sharing the costs of this effort, trusting each other with their information, and building infrastructure to make sharing easy.
Members of the Data Commons Cooperative are principled economic organizations that want it to be easy to share with each other, and with the world, in the movement for a more ethical economy.
Examples of information overlap
Uniting the movements
The budget crunch is hitting everyone. IT departments are being asked to slim down and do more with less. Apparently the government is no exception. The affordability of open source has the government’s attention and is changing the content management and enterprise playing field. Read more about the changes in the Information Week article, “Feds Move To Open Source Databases Pressures Oracle.”
The piece begins:
“Under implacable pressure to slash spending, government agencies are increasingly embracing open source, object-relational database software at the expense of costly, proprietary database platforms. That’s putting new pressure on traditional enterprise software providers, including Oracle, to refine their product lineups as well as their licensing arrangements.”
So giants like Oracle are feeling the crunch, and it is trickling down throughout the proprietary world. But many organizations might not feel comfortable going completely open source and creating their own customized solution, so many are turning to a smart compromise: a value-added open source solution like LucidWorks. Customers get the affordability and agility of open source with the support and expertise of an industry leader. Check out their support and services for assurance that going open source does not mean you will be left on your own.
Emily Rae Aldridge, August 6, 2013
Posted: 02 Aug 2013 04:44 PM PDT
As technology advances quickly, so do security concerns. It stands to reason that new technologies open up new vulnerabilities. But open source is working to combat those challenges in an agile and cost-effective way. Read the latest on the topic in IT World Canada in their story, “Open-Source Project Aims to Secure Cloud Storage.”
The article begins:
“The open source software project named Crypton is working on a solution that would enable developers to easily create encrypted cloud-based collaboration environments. There are very few cloud services that offer effective encryption protection for data storage, according to Crypton. Security has always been the top concern for many enterprise organizations when it comes to cloud services and applications.”
It is reasonable that enterprises are concerned about security when it comes to cloud services and storage. For that reason, many prefer on-site hosting and storage. However, some open source companies, like LucidWorks, build value-added solutions on top of open source software and guarantee security as well as support and training. And while LucidWorks offers on-site hosting as well, those who venture into the Cloud can have the best of both worlds with cost-effective open source software and the support of an industry leader.
Emily Rae Aldridge, August 5, 2013
Back in 2003 visionary artist Anne-Marie Schleiner wrote an inspiring paper entitled “Fluidities and Oppositions among Curators, Filter Feeders and Future Artists,” describing the future role of online curators as nature’s own filter feeders. Anne-Marie is clearly referring to curators and filter feeders in the art world, but her insights apply equally to the larger world of information, data, digital and content curation.
But let me explain.
First. The term “filter feeders” is used in nature to describe a group of animals that thrive on their ability to filter organic matter floating around them. From Wikipedia: “Filter feeders are animals that feed by straining suspended matter and food particles from water, typically by passing the water over a specialized filtering structure. Some animals that use this method of feeding are clams, krill, sponges, baleen whales, and many fish (including some sharks). Some birds, such as flamingos, are also filter feeders. Filter feeders can play an important role in clarifying water, and are therefore considered ecosystem engineers.” And again: “In marine environments, filter feeders and plankton are ecosystem engineers because they alter turbidity and light penetration, controlling the depth at which photosynthesis can occur.”
Second. If you re-read this last sentence slowly and look at what it could mean if applied to the field of content curation, it would read to me something like this: “In large information ecosystems like the web, filter feeders/content curators and content itself are ecosystem engineers because they: a) directly influence our ability to inform ourselves effectively and to discern true from false and useless info (turbidity); b) shed light and clarity on subjects which would otherwise remain obscure (light penetration); c) determine our ability to make sense of our own generated information streams (photosynthesis).”

A very inspiring parallel indeed, giving us a way to visualize the true importance and role that curation, freed from the confines of museums and art galleries, could have on the planetary information ecosystem.

Anne-Marie writes: “Most web sites contain hyperlinks to other sites, distributed throughout the site or in a “favorites” section. Each of these favorite links sections serves as a kind of gallery, remapping other web sites as its own contents. Every web site owner is thus a curator and a cultural critic, creating chains of meaning through association, comparison and juxtaposition, parts or whole of which can in turn serve as fodder for another web site’s “gallery.” Site maintainers become operational filter feeders, feeding of other filter feeders sites and filtering others’ sites. Links are contextualized, interpreted and “filtered” through criticism and comments about them, and also by placement in the topology of a site. The deeper a link is buried, the harder it may be to find, the closer to the surface and the frontpage, the more prominent it becomes, as any web designer can attest to. 
I am what I link to and what I am shifts over time as I link to different sites… … In the process, I invest my identity in my collection – I become how I filter.” Anne-Marie’s vision (2003), pure and uninfluenced by what we have seen emerge in the last few years, paints a very inspiring picture of the true role of content curators and of the key responsibility they hold for humanity’s future. Inspiring. Visionary. Right on the mark. 10/10
Many large Web companies have failed to adopt a decades-old encryption technology to safeguard confidential user communications. Google is a rare exception, and Facebook is about to follow suit.
June 26, 2013
Revelations about the National Security Agency’s surveillance abilities have highlighted shortcomings in many Internet companies’ security practices that can expose users’ confidential communications to government eavesdroppers.
Secret government files leaked by Edward Snowden outline a U.S. and U.K. surveillance apparatus that’s able to vacuum up domestic and international data flows by the exabyte. One classified document describes “collection of communications on fiber cables and infrastructure as data flows past,” and another refers to the NSA’s network-based surveillance of Microsoft’s Hotmail servers.
Most Internet companies, however, do not use a privacy-protective encryption technique that has existed for over 20 years — it’s called forward secrecy — that cleverly encodes Web browsing and Web e-mail in a way that frustrates fiber taps by national governments.
Lack of adoption by Apple, Twitter, Microsoft, Yahoo, AOL and others is probably due to “performance concerns and not valuing forward secrecy enough,” says Ivan Ristic, director of engineering at the cloud security firm Qualys. Google, by contrast, adopted it two years ago.
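To make the idea concrete, here is a minimal Python sketch (an illustration, not any company’s actual configuration) of a TLS client context restricted to ephemeral ECDHE key exchange, the mechanism behind forward secrecy. Because each session negotiates a throwaway key, a recorded fiber tap cannot be decrypted later even if the server’s long-term private key is compromised.

```python
import ssl

# Build a client-side TLS context that only offers cipher suites using
# ephemeral elliptic-curve Diffie-Hellman (ECDHE) key exchange.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.set_ciphers("ECDHE+AESGCM")

# Every TLS 1.2 suite left in the list now uses ephemeral session keys,
# which are discarded after the connection closes, so past traffic stays
# unreadable even if the server's certificate key later leaks.
# (TLS 1.3 suites, named "TLS_...", are forward-secret by design and are
# configured separately from set_ciphers.)
for suite in ctx.get_ciphers():
    print(suite["name"])
```

The performance concern Ristic mentions comes from the extra Diffie-Hellman computation per handshake, which is why some operators historically preferred plain RSA key exchange.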
Jean Lievins: The Networked Society — DISRUPTIVE Technology Rules — and the Most Disruptive of All Technologies is C4ISR Technology that is Also Open Source
It’s about doing the impossible – faster
Technology is transforming how everybody builds solutions, and faster access to the latest technology gives you an unfair advantage. I work in Silicon Valley, and we benefit from that unfair advantage because the technology being invented here is not incremental but disruptive.
You will notice the inclusion of Guardtime signatures. Signing all objects with Guardtime signatures means we no longer have to trust the cloud provider – another game changer! The technology scales so well that it has been included in rsyslog.
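To see why hash-based signatures reduce the need to trust a storage provider, here is a toy Python sketch of hash-linked log signing, the general idea behind keyless, hash-based signature schemes such as Guardtime’s and rsyslog’s log-signing support. This is an illustration only: the real KSI protocol anchors hashes in a globally published hash calendar rather than a local chain.

```python
import hashlib

def chain_logs(records, seed=b"\x00" * 32):
    """Return the chained SHA-256 digests for a sequence of log records.

    Each digest incorporates the previous one, so altering any record
    invalidates every digest from that point onward.
    """
    digests, prev = [], seed
    for rec in records:
        h = hashlib.sha256(prev + rec.encode()).digest()
        digests.append(h)
        prev = h
    return digests

def verify(records, digests, seed=b"\x00" * 32):
    """Recompute the chain and compare it against the stored digests."""
    return digests == chain_logs(records, seed)

logs = ["user login", "config change", "user logout"]
proof = chain_logs(logs)
assert verify(logs, proof)
# Tampering with any record breaks the chain from that record on.
assert not verify(["user login", "CONFIG TAMPERED", "user logout"], proof)
```

The verifier only needs the digests, not any secret key, which is what makes the approach attractive when the data itself lives with an untrusted cloud provider.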