{"id":279,"date":"2018-11-27T15:58:02","date_gmt":"2018-11-27T15:58:02","guid":{"rendered":"http:\/\/blogs.kent.ac.uk\/datascience\/?p=279"},"modified":"2018-11-27T19:08:44","modified_gmt":"2018-11-27T19:08:44","slug":"personal-privacy-when-your-smartphone-becomes-an-ai-assisted-spy","status":"publish","type":"post","link":"https:\/\/blogs.kent.ac.uk\/aida\/2018\/11\/27\/personal-privacy-when-your-smartphone-becomes-an-ai-assisted-spy\/","title":{"rendered":"Personal privacy when your smartphone becomes an AI-assisted spy"},"content":{"rendered":"<p class=\"lead\">Professor Ian McLoughlin explores the state of audio AI, and what happens when it is combined with today&#8217;s smartphones.<\/p>\n<p><img loading=\"lazy\" class=\"aligncenter size-medium wp-image-285\" src=\"http:\/\/blogs.kent.ac.uk\/datascience\/files\/2018\/11\/arms-3674437_960_720-300x251.jpg\" alt=\"\" width=\"300\" height=\"251\" srcset=\"https:\/\/blogs.kent.ac.uk\/aida\/files\/2018\/11\/arms-3674437_960_720-300x251.jpg 300w, https:\/\/blogs.kent.ac.uk\/aida\/files\/2018\/11\/arms-3674437_960_720-768x642.jpg 768w, https:\/\/blogs.kent.ac.uk\/aida\/files\/2018\/11\/arms-3674437_960_720.jpg 861w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/>The vast majority of people in developed countries now carry a smartphone everywhere. 
And while many of us are already well aware of privacy issues associated with smartphones, like their ability to <a href=\"http:\/\/uk.businessinsider.com\/google-tracks-movements-ap-princeton-investigation-2018-8?r=US&amp;IR=T\">track our movements<\/a> or even <a href=\"https:\/\/www.forbes.com\/sites\/josephsteinberg\/2013\/06\/04\/your-smartphone-can-photograph-you-and-share-the-pictures-without-your-knowledge\/#137f156752c4\">take surreptitious photos<\/a>, an increasing number of people are starting to worry that their smartphone is actually <a href=\"https:\/\/www.vice.com\/en_uk\/article\/wjbzzy\/your-phone-is-listening-and-its-not-paranoia\">listening to everything they say<\/a>.<\/p>\n<p>There might not be much evidence for this, but it turns out it isn\u2019t far from the truth. Researchers worldwide have begun developing many powerful audio analysis AI algorithms that can extract a lot of information about us from sound alone. While this technology is only just beginning to emerge in the real world, these growing capabilities \u2013 coupled with a smartphone\u2019s 24\/7 presence \u2013 could have serious implications for our personal privacy.<\/p>\n<p>Instead of analysing every word people say, much of the listening AI that has been developed can learn a staggering amount of personal information from the sound of our speech alone. 
It can determine everything from <a href=\"https:\/\/theconversation.com\/can-voice-recognition-technology-really-identify-a-masked-jihadi-52787\">who you are<\/a> and <a href=\"https:\/\/www.dailymail.co.uk\/sciencetech\/article-3393935\/Can-app-guess-accent-English-Dialects-tool-predicts-hometown-based-pronounce-26-different-words.html\">where you come from<\/a>, <a href=\"https:\/\/arxiv.org\/pdf\/1411.3715.pdf\">your current location<\/a>, your <a href=\"https:\/\/www.sciencedirect.com\/science\/article\/pii\/S088523081630136X\">gender and age<\/a>, and <a href=\"https:\/\/www.nist.gov\/sites\/default\/files\/documents\/2016\/10\/06\/lre15_evalplan_v23.pdf\">what language<\/a> you\u2019re speaking \u2013 all just from the way your voice sounds when you speak.<\/p>\n<p>If that isn\u2019t creepy enough, other audio AI systems can detect <a href=\"https:\/\/pdfs.semanticscholar.org\/fda7\/2ce8b95866dd924326b09fca35a4a68ecab7.pdf\">if you\u2019re lying<\/a>, analyse your <a href=\"http:\/\/www.good-vibrations.nl\/\">health<\/a> and <a href=\"https:\/\/www.sciencedirect.com\/science\/article\/abs\/pii\/S0167639316302692\">fitness level<\/a>, assess your current <a href=\"https:\/\/vokaturi.com\/\">emotional state<\/a>, and tell whether or not you\u2019re <a href=\"https:\/\/www.ncbi.nlm.nih.gov\/pmc\/articles\/PMC3524529\/\">intoxicated<\/a>. 
There are even systems capable of detecting what you\u2019re <a href=\"https:\/\/journals.plos.org\/plosone\/article?id=10.1371\/journal.pone.0154486\">eating when you speak with your mouth full<\/a>, plus a slew of research looking into <a href=\"https:\/\/www.scientificamerican.com\/article\/the-sound-of-your-voice-may-diagnose-disease\/\">diagnosing medical conditions<\/a> from sound.<\/p>\n<p>AI systems can also accurately <a href=\"http:\/\/www.lintech.org\/machine_hearing\/index.html\">interpret events from sound<\/a>, recognising details like car crashes or gunshots, or identifying environments from their <a href=\"http:\/\/dcase.community\/challenge2018\/\">background noise<\/a>. Other systems can identify a <a href=\"https:\/\/patents.google.com\/patent\/US8078470\">speaker\u2019s attitude<\/a> in a conversation, pick up unspoken messages or detect <a href=\"https:\/\/ieeexplore.ieee.org\/document\/6289065\">conflicts between speakers<\/a>. Another AI system developed last year can predict, just by listening to the tone a couple used when speaking to each other, whether or not they will <a href=\"https:\/\/theconversation.com\/ai-can-predict-whether-your-relationship-will-last-based-on-how-you-speak-to-your-partner-81420\">stay together<\/a>. These are all examples of current AI technology developed in research labs worldwide.<\/p>\n<p>All of these technologies \u2013 no matter what they\u2019re trying to learn about you \u2013 use machine learning. This involves training an algorithm with large amounts of data that has been labelled to indicate what information the data contains. 
By processing thousands or millions of recordings, the algorithm gradually begins to infer which characteristics of the data \u2013 often just tiny fluctuations in the sound \u2013 are associated with which labels.<\/p>\n<p>For example, a system used to detect your gender would record speech from your smartphone and process it to <a href=\"http:\/\/www.mcloughlin.eu\/speech\/chapter6.html\">extract \u201cfeatures\u201d<\/a> \u2013 a small set of distinct values that compactly represent a bigger speech recording. Typically, features represent amplitude and frequency information in each successive 20 millisecond period of speech. The way these fluctuate over time will be slightly different for male and female speech.<\/p>\n<p>Machine learning systems look not only at those features, but also at how much, how often, and in which way the features change over time. While the recording happens on the smartphone itself, clips are sent to internet servers, which extract the features, compute their statistics, and handle the machine learning.<\/p>\n<figure class=\"align-center \"><img src=\"https:\/\/images.theconversation.com\/files\/244452\/original\/file-20181107-74783-baxova.jpg?ixlib=rb-1.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip\" alt=\"\" \/><figcaption><span class=\"caption\">Training an algorithm is done using big data.<\/span> <span class=\"attribution\"><span class=\"source\">Professor Ian McLoughlin<\/span><\/span><\/figcaption><\/figure>\n<p>AI was first created to perform conceptual tasks normally requiring human intelligence. At the moment, most AI systems perform analysis and understanding tasks, which means they provide information for humans to act on, rather than acting automatically.<\/p>\n<p>For example, audio AI systems for road monitoring can alert traffic controllers to the sound of a vehicle crash, and audio-based medical diagnosis AI would alert a doctor about findings of concern. 
But a human would still have to make a decision based on the information provided to them by the AI.<\/p>\n<p>New AI technologies are changing this. Many AI systems are starting to exceed human capabilities, with some devices even able to act without human intervention. Amazon Echo and Google Home are both examples of AI with <a href=\"https:\/\/www.intechopen.com\/download\/pdf\/5862\">thinking abilities<\/a>. This kind of AI can respond to commands directly and can also <a href=\"https:\/\/www.researchgate.net\/profile\/Chung-Horng_Lung\/publication\/262359882_Smart_Home_Integrating_Internet_of_Things_with_Web_Services_and_Cloud_Computing\/links\/5bbcfe06a6fdcc9552dcf8b0\/Smart-Home-Integrating-Internet-of-Things-with-Web-Services-and-Cloud-Computing.pdf\">act on these commands<\/a>, like when we ask Alexa to turn on the lights or draw our <a href=\"https:\/\/www.smarthomequest.com\/best-smart-windows-drapes\/\">smart curtains<\/a>.<\/p>\n<p>While most AI systems today are designed to assist people, in the wrong hands these technologies could look more like the <a href=\"https:\/\/en.wikipedia.org\/wiki\/Thought_Police\">Thought Police<\/a> from George Orwell\u2019s <em>1984<\/em>. Audio (and video) surveillance can already detect our actions, but the AI systems we have mentioned are starting to detect what is behind those actions \u2013 what we\u2019re thinking, even if we never speak it aloud.<\/p>\n<p>Most tech firms say their devices don\u2019t record us unless we command them to, but there have been examples of Alexa making recordings <a href=\"https:\/\/www.theverge.com\/2018\/5\/24\/17391898\/amazon-alexa-private-conversation-recording-explanation\">by mistake<\/a>. And researchers have shown that it doesn\u2019t take much to turn your phone into a <a href=\"https:\/\/www.bbc.co.uk\/news\/technology-35639549\">permanent microphone<\/a>. 
It may only be a matter of time before advertisers and scammers start to use this technology to understand exactly how we think, and target our private weaknesses.<\/p>\n<p>Organisations like the <a href=\"https:\/\/www.worldprivacyforum.org\/\">World Privacy Forum<\/a>, <a href=\"https:\/\/www.fightforthefuture.org\/\">Fight for the Future<\/a> and the <a href=\"https:\/\/www.eff.org\/\">Electronic Frontier Foundation<\/a> are working to ensure people have the right to privacy from digital sensing systems, or the right to opt out of commercial surveillance. In the meantime, when you next install an app or a game on your smartphone and it asks to access all the sensors on your device, just remember what you are potentially signing up to.<\/p>\n<p>These data collectors could learn to understand you as well as your closest friend \u2013 probably better, because your phone travels everywhere with you, potentially listening to every sound you make. But while we can trust a true friend with our life, can we say the same for those who are collecting our data today?<\/p>\n<p>Note: this was published by <a href=\"http:\/\/theconversation.com\">The Conversation<\/a>\u00a0on 2018\/11\/27 as &#8220;<a href=\"https:\/\/theconversation.com\/are-phones-listening-to-us-what-they-can-learn-from-the-sound-of-your-voice-105753\">Are phones listening to us? What they can learn from the sound of your voice<\/a>&#8221;.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Professor Ian McLoughlin explores the state of audio AI, and what happens when it is combined with today&#8217;s smartphones. 
The vast majority of people in &hellip; <a href=\"https:\/\/blogs.kent.ac.uk\/aida\/2018\/11\/27\/personal-privacy-when-your-smartphone-becomes-an-ai-assisted-spy\/\">Read&nbsp;more<\/a><\/p>\n","protected":false},"author":55472,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[124],"tags":[],"_links":{"self":[{"href":"https:\/\/blogs.kent.ac.uk\/aida\/wp-json\/wp\/v2\/posts\/279"}],"collection":[{"href":"https:\/\/blogs.kent.ac.uk\/aida\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.kent.ac.uk\/aida\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.kent.ac.uk\/aida\/wp-json\/wp\/v2\/users\/55472"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.kent.ac.uk\/aida\/wp-json\/wp\/v2\/comments?post=279"}],"version-history":[{"count":5,"href":"https:\/\/blogs.kent.ac.uk\/aida\/wp-json\/wp\/v2\/posts\/279\/revisions"}],"predecessor-version":[{"id":286,"href":"https:\/\/blogs.kent.ac.uk\/aida\/wp-json\/wp\/v2\/posts\/279\/revisions\/286"}],"wp:attachment":[{"href":"https:\/\/blogs.kent.ac.uk\/aida\/wp-json\/wp\/v2\/media?parent=279"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.kent.ac.uk\/aida\/wp-json\/wp\/v2\/categories?post=279"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.kent.ac.uk\/aida\/wp-json\/wp\/v2\/tags?post=279"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}