How Your Voice Hides Clues About Your Love Life

Let’s say your relationship is on the rocks. You’ve been trying to work things out together in couples’ counselling, but ultimately, you want to know if it is worth the effort. Will things get better, or are they doomed to fall apart? It might be worth just pausing for a second to listen to your partner. Really listen. When you speak to each other, your voices hold all sorts of information that could reveal the answer. Subtle inflections in tone, the pauses between phrases, the volume at which you speak - it all conveys hidden signals about how you really feel.


A lot of this we pick up on intuitively. We use it to fine-tune the meaning of our words. "Why are you here?" "Why are you here?" "Why are you here?" Stress the "why", the "you" or the "here", and the same question carries three different meanings. That shift in emphasis is one of the more obvious ways we layer our speech with meaning, but there are many more layers we add without realising it. There is, however, a way to extract this hidden information from our speech. Researchers have even developed artificial intelligence that can use this information to predict the future of couples’ relationships. The AI is already more accurate at this than professionally trained therapists. In one study, researchers monitored 134 married couples who had been having difficulties in their relationships.


Over two years, the couples each recorded two 10-minute problem-solving sessions. Each partner chose a topic about their relationship that was important to them, and the pair discussed it together. The researchers also had data on whether the couples’ relationships improved or deteriorated, and whether they were still together, two years later. Trained therapists watched videos of the recordings. By assessing the way the couples spoke to each other, what they said and how they looked while they were talking, the therapists made a psychological assessment of the likely outcome of each relationship. The researchers also trained an algorithm to analyse the couples’ speech.


Previous research had given the team clues about which features of human communication were likely to matter, such as intonation, speech duration and how the individuals took turns to speak. The algorithm’s job was to calculate exactly how these features were linked to relationship strength. The algorithm was based purely on the sound recordings, without considering visual information from the videos. It also ignored the content of the conversations - the words themselves. Instead, the algorithm picked up on features like cadence, pitch and how long each participant talked for. Amazingly, the algorithm also picked up on features of speech beyond human perception. These features are almost impossible to describe, because we’re not typically aware of them - such as spectral tilt, a complex mathematical function of speech.
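To give a flavour of the approach - and only a flavour, since the study’s own code and feature set aren’t reproduced here, so the library choices, feature list and file names below are illustrative assumptions rather than the researchers’ method - a bare-bones version of such a pipeline might summarise each recording with a handful of prosodic features and feed them to an off-the-shelf classifier:

```python
# Hypothetical sketch of an "acoustic features in, relationship outcome out"
# pipeline, loosely following the idea described above. Feature choices,
# file layout and model are illustrative assumptions, not the study's method.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def acoustic_features(wav_path):
    """Summarise one recording with a few simple prosodic features."""
    y, sr = librosa.load(wav_path, sr=16000)
    # Fundamental frequency (pitch) track; unvoiced frames come back as NaN.
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
    )
    f0 = f0[~np.isnan(f0)]
    # Root-mean-square energy as a crude loudness measure.
    rms = librosa.feature.rms(y=y)[0]
    return np.array([
        f0.mean() if f0.size else 0.0,   # average pitch
        f0.std() if f0.size else 0.0,    # pitch variability (intonation)
        voiced_flag.mean(),              # fraction of frames with voiced speech
        rms.mean(),                      # average loudness
    ])

def evaluate(wav_paths, outcomes):
    """Cross-validated accuracy for a set of labelled recordings.

    wav_paths: one recording per couple (placeholder paths).
    outcomes: 1 = relationship improved / stayed together, 0 = deteriorated.
    """
    X = np.vstack([acoustic_features(p) for p in wav_paths])
    y = np.array(outcomes)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    return cross_val_score(clf, X, y, cv=5).mean()
```

The real system worked with far richer acoustic descriptors, including ones like spectral tilt that people do not consciously perceive, but the overall shape - features in, a prediction out - is the same.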


"Using lots of data, we can find patterns that may be elusive to human eyes and ears," says Shri Narayanan, an engineer at the University of Southern California, who led the study. After being trained on the couples’ recordings, the algorithm became marginally better than the therapists at predicting whether or not couples would stay together. The algorithm was 79.3% accurate. The therapists - who had the advantage of also being able to understand the content of the couples’ speech and watching their body language - came in at 75.6% accurate. "Humans are good at decoding many pieces of information," says Narayanan.


The idea is that we are ‘leaking’ more information about our thoughts and emotions than we, as humans, can pick up on. But algorithms are not restricted to decoding the voice features that people consciously use to convey information. In other words, there are other ‘hidden’ dimensions to our speech that can be accessed by AI. "One of the advantages of computers is their ability to find patterns and trends in large amounts of data," says Fjola Helgadottir, a clinical psychologist at the University of Oxford. "Human behaviour can give insight into underlying mental processes," she says. An algorithm that predicts whether or not your relationship is doomed may not be the most appealing idea.


Especially as it is, at present, only around three-quarters accurate. Such a prediction could conceivably change the course of your relationship and how you feel about your partner. But cracking the information hidden in the way we talk - and in how our bodies function - could help to make our relationships better. Theodora Chaspari, a computer engineer at Texas A&M University, has been developing an AI program that can predict when conflict is likely to flare up in a relationship. Chaspari and her colleagues used data from unobtrusive sensors - like a wrist-worn fitness tracker - that 34 couples wore for a day.


The sensors measured sweat and heart rate, as well as vocal data such as tone of voice; the researchers also analysed the content of what the couples said - whether they used positive or negative words. A total of 19 of the couples experienced some level of conflict during the day that they wore the sensors. Chaspari and her colleagues used machine learning to train an algorithm to learn the patterns associated with the arguments the couples reported having. Now the team is developing predictive algorithms that they hope will give couples a heads-up before an argument takes place, by detecting the warning signs that lead up to one.
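As a rough illustration of that idea - and nothing more, since the feature names, numbers and model below are placeholders rather than the team’s actual data or method - a minimal sketch of a conflict-risk classifier built from daily sensor summaries could look like this:

```python
# Hypothetical sketch of a conflict-prediction model: daily summaries of
# sensor readings (sweat, heart rate, tone of voice, word sentiment),
# labelled by whether the couple reported a conflict that day.
# All values below are made-up placeholders for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Each row: [mean skin conductance, mean heart rate,
#            pitch variability, fraction of negative words]
X = np.array([
    [4.2, 88.0, 32.0, 0.18],
    [2.1, 72.0, 18.0, 0.05],
    [5.0, 95.0, 40.0, 0.22],
    [1.8, 70.0, 15.0, 0.03],
])
# Label: 1 = conflict reported that day, 0 = no conflict.
y = np.array([1, 0, 1, 0])

# Standardise the features, then fit a simple logistic-regression classifier.
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

# Probability that today's readings precede a conflict - the kind of score
# an app could use to decide whether to send a calming prompt.
today = np.array([[4.6, 90.0, 35.0, 0.20]])
print(model.predict_proba(today)[0, 1])
```

An app could then act on the predicted probability, for instance sending a message when the risk score crosses a threshold - the kind of heads-up the team describes.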


The way the authors foresee it working is like this: you’ve had a busy day at work, perhaps had a stressful meeting, and you’re on your way home. Your partner has also had a tough day. "At this point, we can intervene in order to resolve the conflict in a more positive way," says Chaspari. This could be done by simply sending a message to couples before the moment heats up, says Adela Timmons, a psychologist on the project based at the Clinical and Quantitative Psychology Center for Children and Families at Florida International University. "We think that we can be more effective in our treatments if we’re able to administer them in people’s real lives at the points that they need them most," she says.


The traditional model of therapy isn’t capable of fulfilling that goal. Typically, a session might take place for an hour a week, in which the patients recall what has happened since the last session and talk through the problems that arose. "The therapist isn’t able to be there in the moment when someone actually needs the support," says Timmons. But an automated prompt, based on continuously monitoring people’s physiology and speech, could make real-time therapeutic intervention a reality. It could also allow for a more standardised form of treatment, says Helgadottir. "Nobody really knows what goes on in a closed therapy room," says Helgadottir, who has developed an evidence-based platform using AI to treat social anxiety.


"Sometimes the most effective techniques aren’t being used since they require more effort on the part of the therapist. On the other hand, the clinical components of AI therapy systems can be completely open and BBC transparent.twitter.com "They can be designed and reviewed by the leading researchers and practitioners in the field. There are potential pitfalls though. There’s no guarantee that a ping from your phone warning of an impending argument won’t backfire and wind you up even more. The timing of the intervention is crucial. "We probably don’t want to actually intervene during a conflict, says Timmons. "If people are already upset, they aren’t going to be terribly receptive to prompts on their phone that they should calm down.


There are plenty of technological hurdles left to overcome before an app like this can be rolled out. The team needs to refine its algorithms and test their efficacy on a wider range of people. There are also big questions around privacy. A data breach of a device storing data on your relationship with your partner would put a lot of sensitive information at risk. One could also question what would happen to the data if there was an alleged crime, such as domestic violence. "We have to think about how we would handle those situations and ways to keep people safe while protecting their privacy," says Timmons.


If this model of therapy is indeed successful, it could also open doors to similar ways to improve other kinds of relationships - such as within the family, at work, or the doctor-patient dynamic. The more that our different bodily systems are monitored - from our eye movements to our muscle tension - the more could be revealed about what is in store for our relationships. There may prove to be many more layers of meaning, beyond our speech and basic physiological reactions, that can best be decoded by machines.

