A/B TESTING ON ONLINE USERS TELLS RESEARCHERS WHAT COLORS MAKE PEOPLE CLICK & BUY AND EVEN MANIPULATES THE INFO ON YOUR SCREEN TO SEE WHAT TYPE OF EFFECT, IF ANY, IT HAS ON YOU, JUDGING BY THE MOOD/TONE OF YOUR SOCIAL MEDIA ACTIVITY/POSTS - ALL WITHOUT YOUR CONSENT:

Websites are using hidden tricks to make you click or buy without realizing it, and the way they do it can be both baffling and controversial.

The internet is one big experiment, and you’re part of it. Every day, millions of trials are manipulating what you see when you browse online, to find out how to keep your attention, make you click more links – and spend more money. And these experiments are often secret. You’ll probably never know you were part of them.

This is all thanks to something now well-known in the tech industry, called A/B testing. It means that the web pages served to you are not necessarily the same as those shown to the next person – they might have slightly different colours, an alternate headline or, on social networks, you could be shown different personal information about your friends and family.

What started as a way to tweak website design is becoming increasingly controversial – in the most divisive cases, A/B testing can help companies sway people’s mood or even their love life. This summer, it emerged that Facebook used the technique to experiment on users, without their knowledge, in an effort to influence their emotions. And more recently came the revelation that dating network OkCupid lied to some of its users about their suitability to be “matched” romantically with another member of the site. The company was hooking people up with unsuitable potential partners, and then tracking their interactions. So, at what point does all this experimentation become outright manipulation?

Facebook experiment criticized: Facebook is facing criticism after it emerged it had conducted a psychology experiment on nearly 700,000 users without their knowledge.

The phenomenon of A/B testing began as a relatively benign, even mundane, way of improving websites. It’s largely used for something called “Conversion Rate Optimisation” (CRO), which is a measure of how well any website is able to engage users. What has made it so powerful, however, is that sometimes it throws up results that nobody would have predicted otherwise.

Earlier this year, for example, a Google executive revealed that using a different shade of blue for advertising links on search result pages caused more people to click on those links, boosting the company’s revenue by $200m. Similarly, travel site TripAdvisor has used A/B testing to discover that certain colours draw some people in more than others. If people have arrived on a TripAdvisor page from a Google advert, for example, they’re more likely to click on a blue button. Other users navigating from within the TripAdvisor site, however, prefer yellow.
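To make the mechanics a bit more concrete, here is a minimal, purely illustrative sketch of how a site might run such a test: each visitor is deterministically bucketed into one of two variants, and the two groups’ conversion rates are later compared with a simple two-proportion z-test. The variant names, the assign_variant helper and the numbers are assumptions made up for the example, not details from Google, TripAdvisor or any other company mentioned here.

# Minimal, illustrative A/B-testing sketch (hypothetical names and numbers).
import hashlib
import math

def assign_variant(user_id: str, experiment: str,
                   variants=("blue_button", "yellow_button")) -> str:
    """Deterministically bucket a user: the same user always sees the same variant."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate really different from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return p_a, p_b, (p_b - p_a) / se

# Example: bucket one visitor, then evaluate made-up results for the two groups.
print(assign_variant("user-12345", "button_colour_test"))

p_a, p_b, z = z_test_two_proportions(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z = {z:.2f}")  # |z| > 1.96 ~ significant at the 5% level

The deterministic hash is also what keeps the experiment invisible: the same visitor always gets the same version of the page, so there is nothing obvious to compare it against.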
Some results are even more baffling. The dental referral service 1-800-DENTIST, for example, recently trialled a variety of photos to encourage visitor engagement on their website, along with other tests. They found the one that won was a dentist with his hand on a female patient’s shoulder, which was unexpected because it’s something a dentist would never do with a patient, says Dan Siroker, CEO of A/B testing company Optimizely, which advised 1-800-DENTIST.

As Stewart Ulm, director of engineering at travel search engine Kayak, puts it, sometimes you don’t know why an A/B result works – it just does. “We try to come up with theories to explain the results that we get, but when we’re doing our experiments with just pure statistical analysis we never really know for sure.”

👆 Ok... this is a VERY interesting article!!! But it’s too long to post here, so read it at the link ❕ ⛔ ⚠

P.S. I fully hold the coders and Hacktivists responsible for coming up with a solution to these problems. An algorithm needs to be designed that can detect when these experiments are happening in real time and where they are originating from. Elements of a website you are visiting are being altered so they’re presented to you differently than they normally would be, or even wrong info is presented on purpose (like being matched up with people who are NOT your actual dating matches on OkCupid - just so they can see what happens), which is what’s going on in these cyber mind-f*ck sessions. Like I’ve said before, we are a bunch of E-GuineaPigs, and these people have nothing better to do than poke and prod us with sticks and jot down the results. Who funds these studies, anyway?!
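(As a very rough illustration of where such a detector might start, and only under the assumption that variants are sometimes served to anonymous visitors too: fetch the same public page several times from fresh, cookie-free sessions and diff the responses; differences between otherwise identical requests are at least a hint that variants are being served. The URL and helper names below are placeholders, not a real tool, and this approach would miss experiments keyed to a logged-in account.)

# Rough sketch of one possible "am I being shown a variant?" probe (illustrative only).
import difflib
import urllib.request

URL = "https://example.com/"   # placeholder page to probe

def fetch_fresh(url: str) -> str:
    """Fetch the page without reusing cookies, so each request looks like a new visitor."""
    req = urllib.request.Request(url, headers={"User-Agent": "variant-probe/0.1"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def variant_hints(url: str, samples: int = 3):
    """Diff several independent fetches; persistent differences may indicate A/B variants."""
    pages = [fetch_fresh(url) for _ in range(samples)]
    baseline = pages[0].splitlines()
    hints = []
    for page in pages[1:]:
        diff = [line for line in difflib.unified_diff(baseline, page.splitlines(), lineterm="")
                if line.startswith(("+", "-")) and not line.startswith(("+++", "---"))]
        hints.extend(diff)
    return hints

if __name__ == "__main__":
    for line in variant_hints(URL)[:20]:
        print(line)

A real tool would also have to filter out harmless differences (rotating ads, timestamps, anti-CSRF tokens) before concluding that a page is running an experiment on its visitors.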
Posted on: Sat, 02 Aug 2014 01:53:18 +0000
