Smart Home technology is a saturated market full of devices that just a few short years ago many thought impossible. We have long talked about voice assistants and video-enabled devices, but technology once considered futuristic has arrived, seemingly omnipresent in households around the globe. Not only are Artificial Intelligence-enhanced, video-enabled devices now available, they come in many varieties, from home protection to two-way video chatting with your pet.
Video “chat” with a pet?
When did animals start chatting?
With these ever-present devices in so many households – cameras, digital assistants, smart TVs, smart thermostats and more – are we enhancing our physical privacy or actually putting it in jeopardy? Are the benefits, such as automation and smartphone remote control, worth the risk to data privacy?
Are you taking advantage of Smart Tech, or is it taking advantage of you?
So what’s the big deal if your smart home has data that’s making your life easier? Amazon’s Alexa can make it easier to order more household items. Google Home can integrate with Google Nest, letting you control your A/C simply by telling it to change the temperature. These are great features that make our day-to-day a little more convenient, but what exactly are these companies doing with our data?
These are relatively mundane, transparently disclosed uses of your data, but they are still important to understand. The bigger concern is the future consequences of these corporations holding your stored data. Google’s then-CEO Eric Schmidt said in 2010:
“One of the things that eventually happens … is that we don’t need you to type at all,” later adding: “Because we know where you are. We know where you’ve been. We can more or less guess what you’re thinking about.”
Adapt that quote to the Smart Home: eventually Google doesn’t need you to speak to Google Home at all; the A/C simply changes to suit your predetermined preferences when you arrive home, thanks to patterns in your stored data combined with Artificial Intelligence. Alexa doesn’t need you to tell her to order paper towels; they just show up, because all of your stored data has signaled that it’s time for another shipment.
While these specific examples of transparency around data storage are promising, consider how much you’re willing to give away and where your line is on data privacy. These devices are always listening, and while the corporations behind them may simply be using this data to “train” their AI, governments or third-party apps could be using it for other reasons.
In 2018, law enforcement subpoenaed Amazon for Amazon Echo data as evidence in a murder trial. The lines between privacy, technology and criminal justice are shifting daily. Amazon is not the only tech company this has happened to; Fitbit and Apple have run into similar situations. While most technology companies are quick to defend consumer privacy, the question still stands:
How much of your personal privacy are you willing to give away?
Letting AI into our daily lives is not something to fear, but maintaining control over data and privacy should be a top concern. There are many ways to protect your privacy, or at least limit your exposure, while still enjoying the benefits of Smart Home tech. Awareness is the first step toward enhanced privacy. Visit 101 Data Protection Tips for a comprehensive list of ways to protect your privacy.
Up Next for NCW: Digitization and Chemical Manufacturing
- Impacts: Covid-19 & Water Utilities
- Texas Interconnection, Renewable Energy Sources & Artificial Intelligence
- April Series: Digital Transformation, Artificial Intelligence & The Future of Energy and Utilities
Thank you for a great entry. Reading this raises two points that I would like to put out there:
1. Is it bad to help train “their” AI? The answer is no, but it’s a loaded question. Many companies scrub their data to use only metadata (stripped of user-specific information), so when that data is sold or learned from, they are using your behavior without knowing it’s you. As long as that remains the case, I believe it is in our interest as consumers to help drive innovation. If the data is sold or used with your identifying markers, then that’s a violation of privacy and should be condemned.
2. “Smart” assistants: These devices have made their way into the homes of even the most technologically challenged people. They have proven very useful, say in the case of a physically challenged individual whose surroundings are fully automated.
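The data scrubbing mentioned in point 1 can be sketched roughly like this. This is a minimal illustration, not any company’s actual pipeline; the field names and the list of what counts as user-specific information are assumptions for the example.

```python
# Hypothetical illustration of "data scrubbing": stripping user-identifying
# fields from an event record so only behavioral metadata remains.
# Field names below are invented for this sketch.

PII_FIELDS = {"user_id", "name", "email", "address", "device_serial"}

def scrub(record: dict) -> dict:
    """Return a copy of the record with user-identifying fields removed."""
    return {k: v for k, v in record.items() if k not in PII_FIELDS}

event = {
    "user_id": "u-12345",
    "email": "someone@example.com",
    "command": "set temperature",
    "value": 72,
    "hour_of_day": 18,
}

# What's left is the behavior (the command, the time of day),
# with nothing tying it back to a specific person.
print(scrub(event))
```

In practice real anonymization is far harder than dropping a few fields (behavioral patterns themselves can be identifying), which is exactly why the “as long as this case persists” caveat matters.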
The responsibility for overuse and abuse still lies with the user/consumer, and it’s up to us to determine how far we want to go in defining our privacy limits and how much we want to share.