Devices like Amazon Echo, Google Home and the forthcoming Apple HomePod are bringing artificial intelligence to the masses. They offer the potential to increase our efficiency by managing our calendars, contacts and to-do lists. With a simple verbal command, they can bring us customized news briefings and stock market reports, and even brighten our day with music and jokes. I am a fan, but if you decide to invite one of these devices into your daily routine, you need to understand the privacy implications.

Imagine you are on the lookout for the perfect stock. Each morning, you awake at 4 a.m., listen to a customized “flash briefing” from your Amazon Echo, and then begin your research. (The Amazon Echo is a voice-controlled digital assistant that exhibits a weak form of artificial intelligence and responds to the “wake word” “Alexa.”) As you review search alert results, one company catches your attention: Blue Star Airlines (BLUSTARR). A legal analyst suggests that BLUSTARR, a small-cap regional player, is about to receive a favorable ruling ending a government investigation that has long hindered the company’s prospects.

You dig deeper, learning everything you can about BLUSTARR. Perhaps you have another digital assistant at the office, one created specifically for financial advisors. (We’ll call it the “Gecko Terminal,” or “GT.” To date, no such device exists, but with the fast pace of technology, more specialized devices are sure to hit the market.) Every day for a week, you ask your GT for more specifics about BLUSTARR: price updates, financial analysis, company news and announcements, etc.

Armed with your information, you are ready to make a move when you notice something odd. Your routine “flash briefing” now includes updates about BLUSTARR. Your colleagues are beginning to talk more about the stock, and both of your digital assistants, the Echo at home and the GT at the office, provide BLUSTARR updates without being asked. Why is there suddenly so much talk about this relatively obscure company? To answer this question, you need to consider what happens to the data created from interactions with a voice-controlled digital assistant (“VDA”).

First, understand that VDAs capture not only voice data, but much more. They can capture data from other connected devices, such as calendar entries, location data and web search history. Other information about your interactions, such as voice tone, inflection, volume and behavioral patterns, can also be captured. And this data generates its own set of data, called metadata (generally, data about data). Metadata may include the time of a recording, the size or length of an audio file, and the identity and geolocation of the person requesting information.
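To make the distinction concrete, here is a minimal sketch, in Python, of what a single VDA interaction record and its accompanying metadata might look like. The structure and field names are illustrative assumptions for this article, not any vendor’s actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical illustration only: these fields are assumptions,
# not any vendor's actual data model.
@dataclass
class InteractionRecord:
    # The interaction data itself
    transcript: str                   # what you asked your VDA
    audio_bytes: int                  # size of the stored voice recording

    # Metadata: data describing the interaction
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )
    duration_seconds: float = 0.0     # length of the audio file
    user_id: str = ""                 # identity of the requester
    geolocation: tuple = (0.0, 0.0)   # latitude/longitude of the device

record = InteractionRecord(
    transcript="What is the latest price of BLUSTARR?",
    audio_bytes=48_000,
    duration_seconds=2.4,
    user_id="user-123",
    geolocation=(40.7128, -74.0060),
)
```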

This data helps make your VDA smarter. Emerging patterns may help personalize your user experience. If you regularly interact with your VDA at certain times of the morning and evening, your VDA may learn to turn on the lights, start your coffee and begin your flash briefing at the same time each day. In our example above, the frequency with which you suddenly start asking for updates on BLUSTARR might suggest that there is something special about the company or its stock. Your VDA (through the service provider) may take note of this, learn your preference, and adjust flash briefings accordingly. These devices operate on principles of machine learning that are designed to interpret your data and tailor responses to your interests.
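As a rough illustration of this kind of pattern detection, the sketch below simply counts how often a user asks about each topic and promotes any topic that crosses a threshold into the daily briefing. Real VDAs rely on far more sophisticated machine learning models; the threshold, the naive topic extraction and the query log here are assumptions made for illustration.

```python
from collections import Counter

# Illustrative sketch only: a real VDA would use statistical models,
# not a fixed threshold over a naive word count.
BRIEFING_THRESHOLD = 5  # assumed cutoff: weekly queries that signal interest

query_log = [
    "BLUSTARR price", "weather", "BLUSTARR news", "BLUSTARR financials",
    "traffic", "BLUSTARR announcements", "BLUSTARR analyst ratings",
]

def topics_for_briefing(queries):
    """Return topics queried often enough to add to the flash briefing."""
    counts = Counter(q.split()[0] for q in queries)  # crude topic extraction
    return [topic for topic, n in counts.items() if n >= BRIEFING_THRESHOLD]

print(topics_for_briefing(query_log))  # -> ['BLUSTARR']
```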

There are obvious tradeoffs. To get the most utility from your digital assistant, you must give up some level of privacy. But what type of data is your service provider capturing? Who has access to this data? Is your data available to third parties?

Consider that artificial intelligence and sentiment analysis are already being deployed for investment research. Is it possible that your stock research data might be shared with third parties, sold to data brokers, or end up being used by your competitors? What are your rights to limit the use and dissemination of such data?

In our example above, your research data could be shared so that, as patterns develop, other VDAs and their service providers may take note and prioritize news and information for their users accordingly. This could explain, in our hypothetical, why BLUSTARR is suddenly mentioned more frequently on your colleagues’ (and competitors’) VDAs.

You may be surprised to discover that, outside of certain regulated areas (notable exceptions include medical records covered by the Health Insurance Portability and Accountability Act of 1996 [HIPAA] Privacy Rule, financial records covered by the Gramm-Leach-Bliley Act [GLBA], and data pertaining to children under age 13, covered by the Children’s Online Privacy Protection Act [COPPA]), U.S. legal doctrine is not well-equipped to deal with the storage, use, and sharing of data collected by VDAs. If you use one of these devices, take time to review your service provider’s terms of service and privacy policy. These documents constitute the contract between you and your service provider, and they govern your rights and obligations when you use the device. Read the “Q&A” sections and consider whether the explanations are consistent with your expectations. If not, your expectations may be unreasonable.
