Audio recordings and synthesized speech are useful tools, but they can't replace the ability to read and write.
If the point of a journey were only the destination, we might conclude that reading and listening to audio recordings or speech synthesizers accomplish the same thing. And if we boil reading down to just one of its functions, acquiring information, then it is indeed practical to have many ways of getting that information.
But making a statement such as "I don't think it makes any difference whether you read or listen," or worse, making that decision for people over whom we have some influence, removes not only the power of choice; equally important, it shuts the door on a crucial means of developing the brain functions that enrich our cognitive abilities.
When studying for a test or doing research, we process language much as we do when listening to a synthesized-speech screen reader, but we also decode and translate that language across various parts of the brain (which explains why many people find it helpful to print a complex set of instructions or read them on a braille display). Relegating all reading to VoiceOver or JAWS for all people, without consideration of the person or the task, impoverishes us in another way: when we are limited to audio recordings and voice synthesizers, the simple acts of identifying a tin, noting a phone number from the back of a business card, or writing a note become adventures in device batteries, headphones, ambient noise, and time.
Braille enables a seamless relationship with the world around us because, like it or not, sighted people view access to the printed word, the printed scribble, the printed number, in short, literacy, as a right, not a privilege, for those living in the 21st century. And so should we.