The IoT (Internet of Things) is an omnipresent part of our lives, supporting everyday tasks at home, at work, in our fitness routines and more. Distributed sensors, wireless networking and cloud computing enable us to control lights, monitor security systems, and manage entertainment and HVAC systems, all from our smartphones. In our work lives, smart building technologies sense how many people are in a conference room and automatically lower the thermostat set point, dim smart windows to block out sunlight, and more. Along the way, these systems gather critical data that can be analyzed to optimize the systems themselves and provide users with valuable insights.
Thoughts from the team
We speak with many customers about their lab-monitoring strategies and goals. Every day. And it’s becoming very clear that the concept of the Smart Lab is taking hold. There are, however, two distinct camps.
We work with a variety of teams and see Smart Lab projects that run the gamut from simple equipment monitoring to sophisticated analysis of multiple data streams. Equipment and lab monitoring gives teams visibility into operational issues and ambient conditions to ensure that everything is operating as expected, while Smart Lab technologies can also help teams understand the other factors that often contribute to repeatability in their work.
Imagine showing up to work on Monday morning only to find that all of your work for the last year has been erased by a malfunctioning freezer. Now imagine that you had already started monitoring some equipment in your lab, but had decided that this particular freezer wasn't quite important enough to justify the expense of monitoring it. That would be a painful moment.
The IoT (Internet of Things) has become an accepted part of our daily lives. Distributed sensors, wireless networking and cloud computing enable us to control lights and HVAC systems from our smartphones, whether from home or from a hotel room on a business trip. More recently, the same IoT technology stack, applied to the lab, has spawned a concept for scientific professionals: the IoLT, or Internet of Lab Things. And just as the IoT enables the Smart Home, the IoLT enables the Smart Lab.
I wanted to take a break from our regularly scheduled programming to highlight a recent, real user success story, one that illustrates the importance of collecting all the data, whether you think you need it or not.
This particular academic user, a student researcher, was conducting growth studies on a model plant organism (Pisum sativum var. saccharatum), testing the effect of various experimental soil amendments on germination and seedling growth. The experiment was conducted indoors under ostensibly climate-controlled conditions, using natural light exposure. Germination in non-dormant seeds is triggered by the presence of water and oxygen; the water rehydrates stored food within the seed and activates hydrolytic enzymes. Under normal conditions, P. sativum is expected to germinate 7 to 9 days after planting. The entire experiment was scheduled with a 9-day germination budget, as this was a time-critical study and previous trials had yielded germination times of as little as 6 days. Almost as an afterthought, an Elemental Machines Element-A already in the general lab area was placed directly in the experiment pod to monitor light, humidity and temperature. (For reference, Element-As monitor ambient temperature, humidity, air pressure and light levels.)
Every single day, lab managers and scientists ask me why they should invest in a ‘smart lab’ when they already have solutions in place that “work just fine.” So let’s talk about what ‘just fine’ really means.
In most labs, 'just fine' refers to the tried-and-true, albeit antiquated, techniques used to support cutting-edge research. It often involves a lab tech walking around to manually check, record and transcribe temperature readings, in the mistaken belief that a single data point over the course of the day is sufficient. In other cases, it's the false comfort of a chart recorder that appears to be tracking readouts continuously. Except, upon closer inspection, it's overwriting valuable data because someone forgot to change the paper disc at the beginning of the month.
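To make the argument concrete, here is a small illustrative sketch (not from the original post, and with entirely hypothetical numbers: a -80 °C freezer setpoint, a -70 °C alarm threshold, and a two-hour overnight fault) showing how a once-a-day spot check can miss an excursion that minute-by-minute monitoring catches:

```python
# Hypothetical example: a ULT freezer trace sampled once per minute
# over 24 hours, with a simulated 2-hour compressor fault overnight.
SETPOINT = -80.0          # degrees C, assumed setpoint
ALARM_THRESHOLD = -70.0   # assumed alarm level: readings above this are excursions

readings = []
for minute in range(24 * 60):
    hour = minute / 60
    # Fault window 02:00-04:00 pushes the temperature to -55 C.
    temp = -55.0 if 2 <= hour < 4 else SETPOINT
    readings.append(temp)

# Manual spot check: the single 9:00 AM reading looks perfectly normal.
spot_check = readings[9 * 60]
print(f"9 AM spot check: {spot_check} C -> "
      f"{'ALARM' if spot_check > ALARM_THRESHOLD else 'looks fine'}")

# Continuous monitoring: every reading is scanned for excursions.
excursion_minutes = [m for m, t in enumerate(readings)
                     if t > ALARM_THRESHOLD]
print(f"Continuous monitoring flagged {len(excursion_minutes)} "
      f"out-of-range minutes")
```

The single daily reading reports "looks fine" while the continuous scan flags the full 120-minute excursion, which is exactly the gap between 'just fine' and actually knowing.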
Since the very first line of code was written, there have been software bugs. One oft-told story traces the origin of the term to 1947, when an actual moth was found trapped in the relays of the Harvard Mark II computer and, like any good bug, was dutifully documented by being taped into the logbook. As a result, the development of debugging tools has closely mirrored the rise of modern software. From symbol tables and breakpoints to the sophisticated predictive code profilers of the 21st century, better debugging tools have enabled us to create complex yet smoothly functioning software.
Now that 2017 resolutions have been made (and, perhaps, already broken), I'm going on record with the prediction that 2017 will be the year of the Smart Lab. From the industrial and commercial sectors to the smart home, smart technologies have made major inroads in every category except one: scientific research.
As a technologist and a scientist, it is gratifying to see the technologies that have transformed business and consumer markets being applied to challenges in the laboratory and beyond. Miniaturized components and sensors, machine learning, wireless communications, and big data have reshaped other industries, and are finally positioned to make a huge impact on the science-based organizations doing the most for humankind.