An objective look: iOS vs. Android for the visually impaired




Another interesting post for you, dear friends. Please read and understand it before buying a new touch-screen mobile. Source: thebinaryportal.net/android_ios.html

By Tamas Geczy

Part I: Basic Features

Introduction

For years now, I have been exploring the depths of an ever-changing mobile world, continuing to adapt as quickly as the pace demands. After Windows Mobile was slashed in 2010, the two remaining players for the blind became Android and iOS. Nokia also abandoned its operating system a year later, so the remaining options were based on these touchscreen platforms. Unlike in earlier days, both offered a greater degree of hope and promise. iOS began including VoiceOver, Apple's screen reader, starting with the iPhone 3GS and iOS 3.0. Android, meanwhile, started work on a screen reader called TalkBack, which provided basic abilities and made Android 1.6 and above usable. At the time, this was limited to the physical keyboard, and many found it a cobbled-together solution that gave terrible access to the screen. Fast-forward to today, and we have a completely different outlook on the accessibility of mainstream technologies. Before we dive into how these platforms compare from an objective standpoint - that is, from the view that neither is better than the other - I must point out some of my background with both. I feel I can provide a non-judgmental view in this article, and it is important that this connection is made with you, the reader, before we start.

My Background in the World of Modern Smartphones

I first began using Android in late 2010, when there was an opportunity to run it on my HTC Touch Pro 2 as a hacked-on solution. Since the phone ran Windows Mobile, this was far from the experience you would get on a regular device.
My friend Leo and I were able to create an image which came up talking from the beginning, and I distributed it in some places for others to try. This gave me access to TalkBack and Spiel (a second screen reader) running on Android 2.2. Compared to Windows Mobile, I was amazed at how much more I could do with a smartphone, and I even dared to use it as a daily operating system.

In February 2011, Verizon and Apple released the iPhone 4 on their network. This opened the door to upgrade my two-year contract, with the same unlimited data plan I had from Windows Mobile, to something different and stock. I used the iPhone until October of that year, and adapted very well to the touch-only environment. Android was placed on the back burner, but I still wanted to experience Google's offerings on a dedicated device. That's how the Droid 2 came into play, both in my life and as part of my Play Store account - pun intended! I purchased it used for $250 on eBay and decided to switch to it as a full-time operating system, back when Gingerbread was rampant across the OS, right before Ice Cream Sandwich (Android 4.0) was released. I came just in time for a highly experimental expedition: only 3 months after I got my phone, some hackers on a forum called XDA-Developers were able to create a working copy of Ice Cream Sandwich for it. With a little help and skill, I was up and running Google's latest sweetness on my Droid 2 Global. This release, for the first time, allowed the use of touch screens, providing me with a direct line of comparison between the iPhone I still had and what was new at the time. I switched back and forth between Android and Apple as often as a DOS user would go from Windows back to MS-DOS. My use of the iPhone for almost a year meant that I loved the simplicity of the platform, yet my tinkering side did not let me put down the Android phone, which gave a large degree of customization iPhone users could only dream of.
Even with jailbreaks, iOS users could only do so much, which did not include modifying the core OS running on their devices with custom ROMs. I upgraded to a Droid 4 in September 2012, knowing that Jelly Bean was a huge improvement in the way blind people could use their phones. It introduced such things as swiping between items and double-tapping them just like on an iPhone, and finally, Braille displays became an option. I decided that giving the latest Android release a try was important. Again, I turned to the XDA community, and sure enough, within 3 months of having my new phone, the option to run Google's latest 4.2 OS became a reality.

Now, I am back on an iPhone once more. I sold my Android phone, keeping only my Nexus 7 Android tablet as a way to test anything new that might happen on the other side. Using an iPhone 5 has truly allowed me to gain a greater understanding of how both Google and Apple operate, and of what each brings to the table in terms of usability for people like myself who need speech and Braille to read the screen on a daily basis. Returning to an iPhone has also shown me which aspects of Android I miss, and which I wanted to have while I was using it. Accordingly, my goal is to pass this experience on to you, starting from the ground up in covering where Google and Apple can each provide advantages in accessibility. Obviously, my use of apps will differ significantly from another person's, so I'm including some experiences I have tested just to cover more ground for those who might need different things on the two platforms. For now, let us begin with the very basics: how do these two differ in the way screen access is achieved?

Accessibility from Two Different Philosophies

The gap which existed in the way Apple and Google implement their screen readers has slowly closed, allowing for a better and less jarring transition between the platforms.
Apple, by nature, provides a more closed-off approach, which is reflected even in the numerous gestures available for VoiceOver. Google, on the other hand, has a highly transparent approach, whereby the use of your Android device is similar to how a sighted person would experience it. This is evident through some of the gestures incorporated into TalkBack, along with a plethora of other ways in which tasks can be completed.

Initial Setup

The way that Android and iOS operate from the start is different. On an Apple device, you can triple-click the home button to turn on VoiceOver, Apple's screen reader. On Android, the gesture is a bit trickier. If running Jelly Bean 4.1 or above, a person can hold down the power button, feel a small vibration or hear a noise, and then place two fingers on the screen slightly apart. This is a hit-or-miss gesture, and oftentimes people make the mistake of still holding the power button, thus turning their device completely off (on some devices, a long hold of the power key forces a shutdown). Usually, the gesture can be repeated after hitting the power button to exit the menu that pops up. By placing one's fingers in a different position on the screen, there is a better chance that it will work on the second or third try. The setup process between the two systems is similar enough that the details are not important - except for one thing.

Passwords on Earphones

Have a pair of earbuds handy? Good, because you'll need them when entering your Wi-Fi and Google passwords. TalkBack and Google's accessibility stack is certainly more privacy-focused: at a password field, all that will be heard are "dot" prompts. Even when sliding a finger around the keyboard, this is the only indication that keys are in focus, and finding out which key is impossible. Once headphones are plugged in, passwords and keys are spoken properly. Alternatively, I have found that skipping the Wi-Fi and Google account sign-in options is possible on most devices.
Therefore, if you really have nothing handy, you can skip them both and go into Settings > Accessibility, where the "speak passwords" option can be enabled. This makes all password fields visible, converting them to regular edit boxes. Either way, privacy is out the door: on iOS, the letters you type are hidden behind a click sound, but the keys you swipe through are not - so a person could very easily hear which key you pause on and figure out a password. There is no good compromise here, and there might never be, considering that speech is not optimal for entering passwords.

The Home Screen

This is where things become tricky. To be clear, I have used Google devices running Motorola's Blur interface, along with my Nexus, which runs Android's own home screen. One problem many see with Android is that manufacturers can customize the look and feel of their devices, to the extent of replacing the entire home screen and app offerings. This is the beauty of open source, but also a step back for accessibility in some ways: if a company does not implement proper accessibility in those applications, they will be nearly impossible to use with TalkBack. I know Samsung is very good with this, but HTC's Sense interface is not. Many companies are moving towards offering a more original Google experience nowadays, such as Motorola, which stripped over 500 megabytes of bloatware out of its latest Droid 4 update. Samsung and HTC both follow the customize-until-you-can't philosophy, and this shines through clearly on both the HTC One and the Galaxy S 4. Your mileage will vary with this one.

Accessibility Gestures

There are a few things here which I can clearly point out - concepts which I miss when using either platform. iOS does offer many more gestures for using VoiceOver.
These include a three-finger triple tap that turns on a screen curtain mode, a two-finger triple tap for toggling speech, and gestures for a so-called rotor - a clockwise dial movement which allows for changing navigational levels when swiping down. On both platforms, small left-to-right swipes move you through the elements on the screen. The key difference? With Apple, an up or down swipe moves you through whichever item you chose with the rotor; on Android, it has nearly the same effect as a left-right swipe. Google's rotor comes in a different form: quick up-and-down swipes done in succession. This takes time to master for some. Essentially, you move your finger up slightly, and then right away back down. A lot of other gestures like this exist across TalkBack, such as ones for activating the home button. They are customizable in TalkBack's settings, so the user does have flexibility over which gesture to assign to which action. This is a good thing.

Scrolling

Scrolling is also very different on the two platforms. On iOS, a three-finger swipe up or down scrolls, accompanied by a tone: the higher the tone, the closer you are to the end of your list. VoiceOver only scrolls by a given amount, which is usually a full page on the screen. For example, the iPhone 5's screen contains 8 rows, so a three-finger scroll gesture up or down will move the item count by 8. Android took a more visual approach, whereas iOS has nothing similar to how the sighted scroll. This visual approach is reflected both by those drawable gestures on Android and by this scrolling technique. Both are well done, though oftentimes I find myself wanting Android's scrolling on iOS. Previously, TalkBack would not scroll the screen for you when swiping left to right. This is about to change with the upcoming TalkBack release, which implements an auto-scroll feature - bringing Android even closer to iOS.
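For the curious, the paging arithmetic described above - advance by one screenful, stop at the end of the list, raise the tone's pitch as you get closer to it - can be sketched in a few lines. This is a toy illustration only; the 8-row page size and the 0.0-to-1.0 pitch value are assumptions made for the example, not VoiceOver's actual implementation.

```java
// Toy model of page-based screen reader scrolling. The row count and
// pitch mapping are illustrative assumptions, not real internals.
public class PagedScroll {
    static final int ROWS_PER_PAGE = 8; // e.g. the iPhone 5 home screen

    // First visible item after one three-finger scroll down,
    // clamped so the list never scrolls past its last page.
    static int scrollDown(int firstVisible, int itemCount) {
        int maxFirst = Math.max(0, itemCount - ROWS_PER_PAGE);
        return Math.min(firstVisible + ROWS_PER_PAGE, maxFirst);
    }

    // Progress value for the feedback tone: 0.0 at the top of the
    // list, 1.0 once the final page is on screen.
    static double tonePitch(int firstVisible, int itemCount) {
        int maxFirst = Math.max(1, itemCount - ROWS_PER_PAGE);
        return Math.min(1.0, (double) firstVisible / maxFirst);
    }

    public static void main(String[] args) {
        int items = 30; // e.g. a 30-item settings list
        int pos = 0;
        pos = scrollDown(pos, items); // first scroll lands on item 8
        pos = scrollDown(pos, items); // second lands on item 16
        System.out.println(pos + " pitch=" + tonePitch(pos, items));
    }
}
```

The clamping at the last page is why a final scroll gesture on either platform moves fewer than a full screenful of items when the list does not divide evenly.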
You will now be able to move left to right by swiping in lists and have the screen advance, similar to how VoiceOver advances lists on its own. Here, too, it is done by a specified amount, similar to iOS. Still, the two-finger up-and-down scrolling method is being kept, and it is practical that the scroll-while-swiping option advances by a set amount in your list. Besides this, there is not much more to Android and iOS gestures. On Android, there is an additional option on Jelly Bean which allows for single-tapping of items. This was how version 4.0 (a.k.a. Ice Cream Sandwich) worked, and a feature I personally used for tapping items. Instead of double-tapping to confirm the item your finger was on, you simply tap once more at that spot, which will then activate it. This is another concept that makes Android feel more sighted, since it allows for activating explored-to items with just a single tap.

Stock Apps, Stock Experience

For this review, I will be focusing on my Nexus 7, which comes with the stock Google apps and an Android build as original as can be. Both Android and iOS come with apps that help you get started with the respective ecosystem you bought into. For Apple, these are the iTunes and App stores; for Google, the Play Store. Google's offering combines both music and apps into one store, whereas Apple separates the two into two apps on the home screen. Your standard calendar and contacts apps are also included. My goal here is not to review every single app's experience, but to show where each platform lacks, out of the box on a stock device, what the other has.

Weather and Stocks Apps: A Point for iOS

Apple includes both a weather and a stocks app on its home screen. These were features I missed from iOS greatly, as I could never find a program which could replace the feel and ease of use they offer. CyanogenMod ROMs include a News and Weather app, which can partially serve the purpose of weather updates.
Otherwise, there are numerous selections on the Play Store, some of which can even provide a widget on the lock screen. Speaking of those...

Widgets: A Point for Android

Widgets are a great way to display information quickly on Android devices. Some use them, some don't. It is another personal preference, and I liked to be balanced with my use of widgets: the more you use, the greater your battery drain and data usage, so over-using this feature is not a good idea. The Google home screen is laid out completely differently from the iOS one. There is a dock of sorts at the bottom with icons, and above that, the ability to choose between 5 home screen pages. The Apps icon is always present, and tapping it brings you to a grid of apps you can open. At the top of that page are other tabs, which include... widgets. Once you find a widget or app you like, you can simply tap and hold it and move it onto the home screen. After this, you are taken out of your app drawer onto the screen where you placed the icon. To be fair, this experience is tricky on both Android and iOS for blind users. Android does not provide any feedback as to which screen you placed the widget or app on. iOS does give this feedback, but precision when moving icons is sometimes very difficult, especially on a cluttered page of apps. Once a widget is on the home screen, it will stay there and provide relevant updates depending on what you chose. A weather widget, for example, will give weather conditions in real time, while a stocks widget might give you information on tickers you follow. It's a great way to get information without opening apps. iOS does have some widgets, which are located in its notification center.

The Notification Center: A Toss-Up Experience

One aspect where Android and iOS are similar yet different is the notification center. This is where all notifications go.
As a blind person, it is very difficult to review the status bar on Android without hearing all of your notifications announced. It is not impossible, just difficult. When you place your finger on the top line of your screen, they are announced all at once, along with battery and signal status. Sometimes moving your finger along the line will speak individual elements, but I found this unreliable at best. You access the notification center by placing your finger on this status line, double-tapping (or tapping, depending on how you set this up in TalkBack settings), and pulling down halfway on the screen. Once you let go, the notification shade opens. iOS is more straightforward about this: as a VoiceOver user, you can simply swipe down with three fingers from the status bar, which can also read off individual items, such as signal strength and orientation lock. When the Android notification shade is open, you are able to scroll left and right among the notifications. They are not grouped by app like on the iPhone, and generally one app will have only one item in the notification center. A "clear all notifications" button will dismiss all of them and return you to the app you were in. Optionally, you can dismiss individual notifications with a button next to each one, or simply perform a two-finger swipe right on the notification you are on; sighted users also swipe right to clear single notifications in a similar fashion. One advantage Android has over iOS in this regard is the quick settings screen. Accessed by doing the same notification swipe from the top right edge of the screen, it conveniently offers options such as toggling Bluetooth and Wi-Fi and changing brightness. While iOS 7 is getting this feature in the new Control Center, right now there is no easy way to change these settings. iOS also has a more cluttered notification center: each app is grouped into having its own set of notifications.
A mail app will have all 100 unread messages you might have in your inbox, and a text messaging app will contain every unread message in the notification center. On Android, most apps group their notifications into one item; the Google Voice app might say, "Google Voice, 13 unread messages." This can be either an advantage or a disadvantage depending on your viewpoint, but personally I do find iOS to be the more cluttered experience. The notification center is also the place where Apple includes a weather and a stocks widget, along with Post to Twitter/Facebook options. It is, right now, the only form of widget you will ever find from Apple, and developers cannot create extra widgets to place in your notification center, either.

Switching Between the App Switchers

Android and iOS deal with multitasking in different ways. Google is less strict about how it allows apps to consume battery and data in the background, whereas Apple only allows certain tasks to be completed. This is about to change with iOS 7's new multitasking features, but for now, apps can only do things such as play audio, use GPS, or facilitate a VoIP call. You double-click the home button to access the app switcher, and here you can hold an item by double-tapping it and keeping your finger on the screen to remove the app from the list and from memory. To open a similar "recent apps" list on Android, just hold down your home button. On some devices this will be a capacitive or physical button, while on others, like the Nexus devices, a soft key. Regardless, a recent apps screen will open, showing all apps that have been used. The gesture to remove these is more visual and arguably more intuitive: you double-tap and swipe the app away from the screen to dismiss it. Alternatively, a hold gesture will bring up a menu allowing you to dismiss the app. When you remove an app, it is not removed from memory right away.
In fact, Android tries to keep apps in memory, which is a big disadvantage especially when listening to a music app. Want to use Spotify but are somewhere else and need to pause music? You could lock your screen and do it from the widget Spotify places there, or you could return to the app. Dismissing it from the recents list will leave you stranded, with the music still playing but the app no longer in your list of used ones.

To Be Continued...

These are just some basics of both the Android and iOS operating systems. In Part II, I will examine productivity on both platforms, including Twitter and creating documents for college courses. This will also include reading e-textbooks and accessing entertainment applications. Keep in mind that because of the ever-changing environment in technology, this review might not be relevant even six months from now.
Posted on: Wed, 09 Oct 2013 05:53:18 +0000
