January 28, 2021 | By Greg Dougherty
Day by day, the realities of global problems such as climate change, disease and overpopulation are becoming ever more alarming. Take climate change, for instance. Every hour, a piece of land the size of a football field falls into the Gulf of Mexico. As the world’s temperatures rise and rainfall patterns shift, small islands in the Atlantic, Pacific and Caribbean are drying out. Climate change is even measurably affecting the rotation of the Earth.
As these examples demonstrate, we are at a tipping point in the struggle against climate change: the policies agreed upon today will significantly shape the quality of life of future generations. For this reason, it’s imperative that our political leaders and scientists base environmental policy on information gleaned from Big Data rather than on received opinion.
That’s right: the same Big Data collection processes that let global enterprises make sound business decisions can be leveraged to help secure a viable ecological future.
That said, there are hurdles to clear in how organizations receive, manage and distribute the staggering amount of information they are collecting. I recently participated in a NYTECH forum on how technology is being used to tackle global challenges such as global warming, and we spent time discussing both the power and the limitations of Big Data. The panelists all acknowledged that, thanks to the Internet of Things (IoT), more data is available today than ever before. Most organizations, however, are unprepared to handle it.
Some of the data-related issues facing the research community today include:
Identifying and separating interesting data: Big Data is, well, very big. A single Boeing 787, for instance, can generate half a terabyte of data on a single flight. A major part of the climate change challenge, then, is making sure critical incoming data doesn’t end up sitting unused on a server or get deleted by accident (a small triage sketch follows this list).
Processing real-time information: Fresh, fast-moving data needs to be processed, indexed and compared against historical records in real time (a Spark sketch below illustrates one approach).
Verifying data for consistency: Big Data, above all else, must be accurate and consistent. Businesses storing large volumes of information must be able to verify the integrity of their collection and storage processes (a checksum sketch also follows this list).
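As an illustration of the first challenge, here is a minimal Python sketch of how an ingest pipeline might triage incoming records, routing “interesting” readings to analysts instead of letting them sit unread in cold storage. The station names, field names and alert threshold are all hypothetical; real triage rules would come from domain scientists.

```python
import json

# Hypothetical alert threshold -- a real rule would come from climate scientists.
SEA_LEVEL_ALERT_MM = 150.0

def is_interesting(record):
    """Flag records worth routing to analysts rather than cold storage."""
    return record.get("sea_level_anomaly_mm", 0.0) > SEA_LEVEL_ALERT_MM

def triage(lines):
    """Split a raw feed into 'hot' records for analysis and 'cold' archive data."""
    hot, cold = [], []
    for line in lines:
        record = json.loads(line)
        (hot if is_interesting(record) else cold).append(record)
    return hot, cold

# Example feed: two hypothetical tide-gauge readings.
feed = [
    '{"station": "GOM-17", "sea_level_anomaly_mm": 210.4}',
    '{"station": "GOM-02", "sea_level_anomaly_mm": 12.1}',
]
hot, cold = triage(feed)
print(f"{len(hot)} record(s) routed to analysts, {len(cold)} archived")
```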
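And for the third challenge, verifying integrity can be as simple as recording a checksum when data is ingested and re-checking it before the data is used. A minimal sketch, assuming records are stored as raw bytes alongside their SHA-256 digests:

```python
import hashlib

def checksum(payload: bytes) -> str:
    """SHA-256 digest recorded when the data is first ingested."""
    return hashlib.sha256(payload).hexdigest()

def verify(payload: bytes, recorded_digest: str) -> bool:
    """Re-hash at read time; a mismatch means the stored copy was corrupted."""
    return checksum(payload) == recorded_digest

reading = b'{"station": "GOM-17", "sea_level_anomaly_mm": 210.4}'
digest_at_ingest = checksum(reading)

# Later, before the record feeds a climate model, confirm it is unchanged.
assert verify(reading, digest_at_ingest)
print("integrity check passed")
```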
I’m confident that the growing pains associated with the proliferation of the IoT are not long-term issues. There are already many excellent tools for analyzing large pools of data, including Hadoop, Storm, Spark and MapR, as well as a range of NoSQL databases. As these tools become more refined and more readily accessible, it will become easier and more affordable for organizations to make use of the vast streams of information they are collecting.
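As a concrete example of what these tools make possible, here is a minimal PySpark (Spark Structured Streaming) sketch along the lines of the real-time challenge above: fresh readings are compared against historical baselines as they arrive. The bucket paths, column names and 10% anomaly threshold are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("climate-stream-sketch").getOrCreate()

# Hypothetical schema for incoming tide-gauge readings.
schema = (StructType()
          .add("station", StringType())
          .add("observed_at", TimestampType())
          .add("sea_level_mm", DoubleType()))

# Static historical baselines (station -> long-run average), e.g. in Parquet.
historical = spark.read.parquet("s3://example-bucket/station-baselines/")

# Fresh readings arriving as JSON files in a landing directory.
fresh = spark.readStream.schema(schema).json("s3://example-bucket/incoming/")

# Compare each new reading to its station's historical average as it arrives,
# flagging readings more than 10% above the baseline.
anomalies = (fresh.join(historical, "station")
                  .filter(F.col("sea_level_mm") > F.col("baseline_mm") * 1.10))

query = (anomalies.writeStream
                  .outputMode("append")
                  .format("console")
                  .start())
query.awaitTermination()  # block until the stream is stopped
```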
The hope is that, soon enough, it will be possible to model the long-term environmental impacts of traffic congestion, oil spills, carbon emissions and manufacturing output. Big Data will make it possible for political and business leaders, as well as ordinary citizens, to better understand how their personal and professional decisions affect the global community and our planet.