If you just spent the long weekend lazing near a lake or river, pause for a moment and let this statistic wash over you: These sources account for barely 0.3 percent of the earth’s fresh water. The rest is trapped in glaciers or underground. It isn’t just the oceans that require a whole lot more human stewardship.
Enter IBM’s ever-growing army of data scientists, who hope to use research distilled from dozens of ambitious water metering and data analytics projects over the past decade to help communities around the world get far smarter about water management.
One of the biggest experiments is its collaboration with the Rensselaer Polytechnic Institute and the Fund for Lake George.
The so-called Jefferson Project — which involves more than 60 scientists from New York, Brazil, Texas and Ireland — just reached the halfway mark. Scientists have been studying the lake manually for more than three decades, long enough to know that algae and chemical levels have changed dramatically. Much of that stems from increased use of road salt during winter months, which makes its way into the 32-mile-long lake.
“The story here is that we’re trying to advance our knowledge of how the entire ecosystem functions,” said Harry Kolar, distinguished engineer for sensor-based solutions with IBM Research.
Over the past 18 months, the team has collected enough information to improve the models being created for the 200-foot lake, which accounts for almost $1 billion in tourism for the nearby upstate New York region. The data has been gathered through an elaborate array of sensors deployed in buoys and sampling stations.
“The Jefferson Project provides the unique opportunity for biologists and environmental scientists to work closely with engineers, physicists, computer scientists and meteorologists to understand large lakes at a level of detail and intensity that is simply unprecedented,” said the project’s director, Rick Relyea, in a statement.
Here are some specifics:
- Two-day weather forecasts can be made twice daily and are far more accurate — down to a half-mile resolution — for precipitation, wind speed and temperature.
- Water runoff data can be mapped at a much higher level of detail.
- Salt levels are measured for the six areas of the lake where the greatest impact is being felt.
- Scientists have a much more vivid idea of the lake’s circulation: one informed by 468 million depth measurements, compared with 564 data points in previous models.
What’s the plan for the next year and a half?
Among other things, IBM plans a series of 3D visualizations of the existing data. It also plans to apply image-recognition software to the task of finding and cataloguing plankton. Eventually, some of these tasks will be automated, Kolar said.
“The key to protecting this precious natural resource lies in the data, and the stage is now set to discover a deluge of insights about the delicate ecology of the lake and the factors that threaten it,” he said. “The results of our efforts will help drive new ways to protect bodies of fresh water around the world.”
The role of smart sensors in leak detection has been well documented. IBM’s research at Lake George — and other projects, such as ones on the Hudson and Amazon Rivers — will tell us much more about biodiversity.
Why should a municipality, community or business care?
Kolar offers a scenario in which automated sensors might be deployed to map chemical levels and various ambient conditions, including weather. Using historical data as the guide, the city’s operations team might program an alert that triggers during a potential pollution event. That information could be used to take action on water treatment operations.
“The idea is that some of these tools and technologies will look for certain conditions or patterns or correlations people might not have thought about,” he said.
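The alerting scenario Kolar describes could be sketched, in simplified form, as a comparison of live sensor readings against a historical baseline. The sketch below is purely illustrative — the sensor values, thresholds and function names are assumptions, not details from the Jefferson Project.

```python
# Hypothetical sketch: flag a potential pollution event when a reading
# deviates sharply from the historical baseline for that sensor.
# All values and names here are illustrative, not project data.
from statistics import mean, stdev

def pollution_alert(historical, current, n_sigmas=3.0):
    """Return True if `current` deviates from the historical baseline
    by more than `n_sigmas` standard deviations."""
    baseline = mean(historical)
    spread = stdev(historical)
    return abs(current - baseline) > n_sigmas * spread

# Example: chloride readings (mg/L) from a hypothetical lake sensor.
history = [18.2, 19.1, 18.7, 18.9, 19.4, 18.5, 19.0, 18.8]
print(pollution_alert(history, 19.2))   # typical reading -> False
print(pollution_alert(history, 31.5))   # sharp spike -> True
```

A real deployment would fold in weather and seasonal patterns rather than a single static baseline, which is where the project’s historical data and models come in.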
An example of how technology could be used in watershed management comes from Markham, Ontario, Canada, where IBM is teaming with the Southern Ontario Water Consortium. The project on the Grand River, the largest inland river system in the province, uses 120 sensors to collect data every 15 minutes.
Big Data also could be used when considering proposals to locate future commercial facilities on river or lakeshores.
The Beacon Institute, which worked with IBM on a series of projects on the Hudson River, uses a sophisticated system for real-time water monitoring called the River and Estuary Observatory Network, or REON.
“Real-time data is already telling us that conditions in the Hudson River near the Indian Point nuclear plant are perfect spawning grounds for the endangered Atlantic sturgeon,” said Chief Research Officer James Bonner in remarks prepared for a recent talk.
“If this data existed when the plant was in its planning stages, would it be located where it is today?”