Researchers at the University of California, Riverside have developed a new, more computationally efficient way to process data from the Global Positioning System (GPS), enhancing location accuracy from the meter level down to a few centimeters.
The optimization will be used in the development of autonomous vehicles, improved aviation and naval navigation systems, and precision technologies. It will also enable users to access centimeter-level accuracy location data through their mobile phones and wearable technologies, without increasing the demand for processing power.
The research, led by Jay Farrell, professor and chair of electrical and computer engineering in UCR’s Bourns College of Engineering, was published recently in IEEE Transactions on Control Systems Technology. The approach involves reformulating a series of equations that are used to determine a GPS receiver’s position, resulting in reduced computational effort being required to attain centimeter accuracy.
First conceptualized in the early 1960s, GPS is a space-based navigation system that allows a receiver to compute its location and velocity by measuring the time it takes to receive radio signals from four or more overhead satellites. Due to various error sources, standard GPS yields position measurements accurate to approximately 10 meters.
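The core computation can be sketched as an iterative least-squares solve: each satellite’s signal travel time gives a “pseudorange” (true range plus a shared receiver clock error), and the receiver solves for its position and clock bias. The sketch below is a simplified toy illustration of that idea, not the UCR method; the satellite geometry and numbers are invented, and real receivers model many additional error sources.

```python
import numpy as np

def solve_position(sats, pseudoranges, iters=20):
    """Estimate receiver position (x, y, z) and clock-bias distance b
    from satellite positions and pseudoranges via Gauss-Newton iteration."""
    x = np.zeros(4)  # [x, y, z, b], initialized at Earth's center
    for _ in range(iters):
        diffs = sats - x[:3]                 # vectors receiver -> satellites
        ranges = np.linalg.norm(diffs, axis=1)
        residuals = pseudoranges - (ranges + x[3])
        # Jacobian of predicted pseudorange w.r.t. [x, y, z, b]:
        # negative unit vectors toward each satellite, and 1 for the bias
        J = np.hstack([-diffs / ranges[:, None], np.ones((len(sats), 1))])
        x += np.linalg.lstsq(J, residuals, rcond=None)[0]
    return x

# Toy scenario: receiver on the surface, four satellites ~20,000 km up
true_pos = np.array([6371e3, 0.0, 0.0])
true_bias = 150.0  # receiver clock error, expressed in meters
sats = np.array([[26571e3, 0.0, 0.0],
                 [20000e3, 17000e3, 0.0],
                 [20000e3, 0.0, 17000e3],
                 [20000e3, 12000e3, 12000e3]])
pr = np.linalg.norm(sats - true_pos, axis=1) + true_bias
est = solve_position(sats, pr)
```

With four satellites the system is exactly determined (three position coordinates plus the clock bias), which is why a receiver needs signals from four or more satellites.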
Differential GPS (DGPS), which enhances the system through a network of fixed, ground-based reference stations, has improved accuracy to about one meter. But meter-level accuracy isn’t sufficient to support emerging technologies like autonomous vehicles, precision farming, and related applications.
“To fulfill both the automation and safety needs of driverless cars, some applications need to know not only which lane a car is in, but also where it is in that lane, and need to know it continuously at high rates and high bandwidth for the duration of the trip,” said Farrell, whose research focuses on developing advanced navigation and control methods for autonomous vehicles.
Farrell said these requirements can be achieved by combining GPS measurements with data from an inertial measurement unit (IMU) through an inertial navigation system (INS). In the combined system, the GPS provides data to achieve high accuracy, while the IMU provides data to achieve high sample rates and high bandwidth continuously.
Achieving centimeter accuracy requires “GPS carrier phase integer ambiguity resolution.” Until now, combining GPS and IMU data to solve for the integers has been computationally expensive, limiting its use in real-world applications. The UCR team has changed that, developing a new approach that results in highly accurate positioning information with several orders of magnitude fewer computations.
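To give a feel for what “integer ambiguity resolution” means: the receiver measures the carrier wave’s phase, which pins down the range to a small fraction of the 19 cm wavelength, but only up to an unknown whole number of cycles N. The toy sketch below resolves N by simple rounding against a coarse range estimate; this is only an illustration of the concept (it assumes the coarse estimate is already accurate to within half a wavelength, which real code-based estimates are not), and it is not the paper’s method, which solves the much harder joint GPS/IMU integer problem efficiently.

```python
# Toy illustration of carrier-phase integer ambiguity (not the paper's method).
L1_WAVELENGTH = 0.1903  # GPS L1 carrier wavelength in meters (c / 1575.42 MHz)

def fix_ambiguity(phase_cycles, coarse_range_m):
    """Resolve the unknown integer cycle count N using a coarse range estimate,
    then reconstruct the precise carrier-phase range."""
    n = round(coarse_range_m / L1_WAVELENGTH - phase_cycles)
    precise_range_m = (phase_cycles + n) * L1_WAVELENGTH
    return n, precise_range_m

# Simulate: the receiver observes only the fractional part of the range in cycles
true_range = 20_300_123.456                  # meters
total_cycles = true_range / L1_WAVELENGTH
n_true = int(total_cycles)                   # unknown to the receiver
phase = total_cycles - n_true                # what the receiver actually measures
# Coarse estimate with a 5 cm error -- well within half a wavelength
n_est, r_est = fix_ambiguity(phase, true_range + 0.05)
```

Once N is fixed correctly, the reconstructed range inherits the millimeter-scale precision of the phase measurement, which is where the centimeter-level positioning comes from.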
“Achieving this level of accuracy with computational loads that are suitable for real-time applications on low-power processors will not only advance the capabilities of highly specialized navigation systems, like those used in driverless cars and precision agriculture, but it will also improve location services accessed through mobile phones and other personal devices, without increasing their cost,” Farrell said.
What drives you to Facebook? News? Games? Feedback on your posts? The chance to meet new friends?
If any of these hit home, you might have a Facebook dependency. But that’s not necessarily a bad thing, says Amber Ferris, an assistant professor of communication at The University of Akron’s Wayne College.
Ferris, who studies Facebook user trends, says the more people use Facebook to fulfill their goals, the more dependent on it they become. She is quick to explain that this dependency is not equivalent to an addiction. Rather, the reason people use Facebook determines the level of dependency they have on the social network. The study found that those who use Facebook to meet new people were the most dependent on it overall.
To identify dependency factors, Ferris and Erin Hollenbaugh, an associate professor of communication studies at Kent State University at Stark, studied 301 Facebook users between the ages of 18 and 68 who post on the site at least once a month. They found that people who perceive Facebook as helpful in gaining a better understanding of themselves go to the site to meet new people and to get attention from others. Also, people who use Facebook to gain a deeper understanding of themselves tend to have agreeable personalities, but lower self-esteem than others.
Ferris explains that some users observe how others cope with problems and situations similar to their own “and get ideas on how to approach others in important and difficult situations.”
Ferris and Hollenbaugh presented “A Uses and Gratifications Approach to Exploring Antecedents to Facebook Dependency” at the National Communication Association conference in Las Vegas in November. They say other Facebook dependency signs point to users’ needs for information or entertainment. In other words, a user knows about the local festival scheduled for this weekend thanks to Facebook.
In their previous studies, “Facebook Self-disclosure: Examining the Role of Traits, Social Cohesion, and Motives” (2014) and “Predictors of Honesty, Intent, and Valence of Facebook Self-disclosure” (2015) published in the journal Computers in Human Behavior, Ferris and Hollenbaugh also uncovered personality traits common among specific types of Facebook users.
For example, people who use Facebook to establish new relationships tend to be extroverted. Extroverts are more open to sharing their personal information online, but are not always honest with their disclosures, Ferris says.
The most positive posts online come from those who have high self-esteem, according to Ferris.
“Those who post the most and are the most positive in posts do so to stay connected with people they already know and to gain others’ attention,” Ferris says. “This makes a lot of sense – if you are happy with your life, you are more likely to want to share that happiness with others on social media.”
Since the earliest times, laughter and humor have performed important functions in human interaction. They help to expedite courtship, improve conversational flow, synchronize emotional states and enhance social bonding. Jokes, a structured form of humor, give us control over laughter and are therefore a way to elicit these positive effects intentionally. In order to comprehend why some jokes are perceived as funny and others are not, Robert Dunbar and colleagues at Oxford University investigated the cognitive mechanism underlying laughter and humor. The research is published in Springer’s journal Human Nature.
The ability to fully understand other people’s often unspoken intentions is called mentalizing, and involves different levels of so-called intentionality. For example, an adult can comprehend up to five such levels of intentionality before losing the plot of a too-complex story. Conversations that share facts normally involve only three such levels. Greater brain power is needed when people chat about the social behavior of others, because it requires them to think and rethink themselves into the shoes of others.
The best jokes are thought to build on a set of expectations and have a punchline to update the knowledge of the listener in an unexpected way. Expectations that involve the thoughts or intentions of people other than the joke-teller or the audience, for example the characters in the joke, are harder to pin down. Our natural ability to handle only a limited number of mindstates comes into play.
In order to shed light on how our mental ability limits what we find funny, the researchers analyzed the reaction of 55 undergraduates from the London School of Economics to 65 jokes from an online compilation of the 101 funniest jokes of all time. The collection mostly consisted of jokes from successful stand-up comedians. Some jokes in the compilation were mere one-liners, while others were longer and more complex. A third of the jokes were factual and contained reasonably undemanding observations of idiosyncrasies in the world. The rest involved the mindstates of third parties. The jokes were rated on a scale from one to four (not at all funny to very funny).
The research team found that the funniest jokes are those that involve two characters and up to five back-and-forth levels of intentionality between the comedian and the audience. People easily lose the plot when jokes are more complex than that. The findings do not suggest that humor is defined by how cleverly a joke is constructed, but rather that there is a limit to how complex its contents can be to still be considered funny. According to Dunbar, increasing the mentalizing complexity of the joke improves the perceived quality, but only up to a certain point: stand-up comedians cannot afford to tell intricate jokes that leave their audience feeling as if they’ve missed the punchline.
“The task of professional comics is to elicit laughs as directly and as fast as possible. They generally do this most effectively when ensuring that they keep within the mental competence of the typical audience member,” says Dunbar. “If they exceed these limits, the joke will not be perceived as funny.”
It is likely that everyday conversational jokes do not involve as many intentional levels as those that have been carefully constructed by professional comedians. Further research needs to be conducted in this area. However, Dunbar’s findings shed some light on the mechanics of language-based humor and therefore on the workings of our mind.
A new computer algorithm can predict, with nearly 79 percent accuracy, whether you and your spouse will have an improved or worsened relationship based on the tone of voice you use when speaking to each other.
In fact, the algorithm did a better job of predicting marital success of couples with serious marital issues than descriptions of the therapy sessions provided by relationship experts. The research was published in Proceedings of Interspeech on September 6, 2015.
Researchers recorded hundreds of conversations from over one hundred couples taken during marriage therapy sessions over two years, and then tracked their marital status for five years.
An interdisciplinary team then developed an algorithm that broke the recordings into acoustic features using speech-processing techniques. These included pitch, intensity, “jitter,” and “shimmer,” among many others: things like tracking warbles in the voice that can indicate moments of high emotion.
Taken together, the vocal acoustic features offered the team’s program a proxy for the subject’s communicative state, and the changes to that state over the course of a single therapy session and across sessions.
These features weren’t analyzed in isolation – rather, the impact of one partner upon the other over multiple therapy sessions was studied.
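To make two of the named features concrete: jitter measures cycle-to-cycle variation in the pitch period, and shimmer measures cycle-to-cycle variation in amplitude; both rise when the voice warbles. The sketch below uses the standard relative definitions (mean absolute difference between consecutive cycles, normalized by the mean), with invented toy cycle values; the team’s actual feature extraction was far richer.

```python
import numpy as np

def jitter(periods):
    """Relative jitter: mean absolute cycle-to-cycle period difference,
    divided by the mean period."""
    periods = np.asarray(periods, dtype=float)
    return np.mean(np.abs(np.diff(periods))) / np.mean(periods)

def shimmer(amplitudes):
    """Relative shimmer: mean absolute cycle-to-cycle amplitude difference,
    divided by the mean amplitude."""
    amplitudes = np.asarray(amplitudes, dtype=float)
    return np.mean(np.abs(np.diff(amplitudes))) / np.mean(amplitudes)

# Steady voice: ~200 Hz pitch (5 ms periods), little variation
steady = (jitter([5.00, 5.01, 4.99, 5.00]), shimmer([1.00, 1.02, 0.98, 1.00]))
# Agitated voice: warbling pitch and loudness -> higher jitter and shimmer
agitated = (jitter([5.0, 5.6, 4.4, 5.3]), shimmer([1.0, 1.5, 0.6, 1.2]))
```

Feeding per-session feature vectors like these into a classifier is the general shape of the pipeline the article describes.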
Once it was fine-tuned, the program was tested against behavioral analyses made by human experts who had coded the sessions for positive qualities like “acceptance” or negative qualities like “blame.” The team found that studying voice directly, rather than the expert-created behavioral codes, offered a more accurate glimpse at a couple’s future.
Next, using behavioral signal processing, the team plans to use language (e.g., spoken words) and nonverbal information (e.g., body language) to improve the prediction of how effective treatments will be.
“Trending” topics on the social media platform Twitter show the quantity of tweets associated with a specific event. However, trends only show the highest volume keywords and hashtags, and may not give qualitative information about the tweets themselves. Now, using data associated with the Super Bowl and World Series, researchers at the University of Missouri have developed and validated a software program that analyzes event-based tweets and measures the context of tweets rather than just the quantity. The program will help Twitter analysts gain better insight into human behavior associated with trends and events.
“Trends on Twitter are almost always associated with hashtags, which only gives you part of the story,” said Sean Goggins, assistant professor in the School of Information Science and Learning Technologies at MU. “When analyzing tweets that are connected to an action or event, looking for specific words at the beginning of the tweets gives us a better indication of what is occurring, rather than only looking at hashtags.”
Goggins partnered with Ian Graves, a doctoral student in the Computer Science and IT Department at the College of Engineering at MU. Graves developed software that analyzes tweets based on the words found within them. He programmed a “bag of words,” or tags the researchers felt would be associated with the Super Bowl and World Series, and the software analyzed those words and their placement within the 140-character tweets.
“The software is able to detect more nuanced occurrences within the tweet, like action happening on the baseball field in between batters at the plate or plays in the game,” Graves said. “The program uses a computational approach to seek out not only a spike in hashtags or words, but also what’s really happening on a micro level. By looking for low-volume, localized tweets, we gleaned intelligence that stood apart from the clutter and noise associated with tweets related to the World Series.”
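A minimal sketch of the idea of weighting words by their placement in a tweet, per Goggins’ point that words at the beginning better indicate what is occurring. The word list, weighting scheme, and tweets below are invented for illustration; this is not the MU software.

```python
# Hypothetical bag of words for a football game (invented for illustration)
BAG_OF_WORDS = {"touchdown", "interception", "fumble", "kickoff"}

def event_score(tweet, bag=BAG_OF_WORDS):
    """Score a tweet by matching bag-of-words terms, weighting words that
    appear earlier in the tweet more heavily."""
    score = 0.0
    for i, word in enumerate(tweet.lower().split()):
        if word.strip(".,!?#") in bag:
            score += 1.0 / (1 + i)   # word 0 counts fully, later words less
    return score

tweets = ["Touchdown! What a play by the rookie",
          "I can't believe the halftime show, touchdown or not"]
scores = [event_score(t) for t in tweets]
```

The first tweet, which opens with an event word, scores far higher than the second, which mentions the same word incidentally near the end; ranking tweets this way is one simple way to separate action-describing tweets from background chatter.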
Goggins feels using this method to analyze tweets on a local level can help officials involved with community safety or disaster relief to investigate the causes of major events like the Boston bombing or to help predict future events.
“Most of the things that happen on Twitter are not related to specific events in the world,” Goggins said. “If analysts are just looking at the volume of tweets, they’re not getting the insight they need about what’s truly happening or the whole picture. By focusing on the words within the tweet, we have the potential to find a truer signal inside of a very noisy environment.”
The study, “Sifting signal from noise: a new perspective on the meaning of tweets about the ‘big game,’” was published in the journal New Media and Strategy.