Rosie's Resonance Chamber

TechForAll

My cousin Samira asked her mom how I'm managing to build a website when I'm print-impaired. It's a great question — because from the outside, it looks impossible. From my side, it's just a different way of thinking. I build in layers:

💻 Laptop — this is where I run the technical side. I use Cloudflare to manage my domain, security, and file delivery. Its dashboard is well-labeled and works beautifully with screen readers. I store my public files in R2, Cloudflare's file storage, and share them with clean, direct links — no sighted steps required.

📲 iPad — this is my writing studio. I use the WriteFreely app for iOS to draft blog posts in Markdown, a simple, text-based way to format content. Instead of clicking bold or italic buttons, I just type `**bold**` or `# Heading`. Markdown is perfect for blind and print-impaired writers because it's pure text — no visual editor to wrestle with, no formatting traps.

📱 iPhone — my editing and refining tool. I can update posts, fix typos, or check tags while I'm traveling, entirely by ear.

Underneath all that runs the technology that makes it possible:

• NVDA (NonVisual Desktop Access) on Windows, my primary screen reader. It speaks every line of code, every menu, every status message. I navigate with keyboard shortcuts instead of a mouse.

• VoiceOver on iPad and iPhone, Apple's built-in screen reader. It lets me explore the screen with touch gestures — a single tap announces what's under my finger, a double-tap activates it.

Together, they turn my devices into voice-driven control panels. I don't look at my code; I listen to it. I'm also a self-trained junior-level developer, which means when I hit a wall, I know how to climb it. I research, experiment, and problem-solve using tools like DuckDuckGo, YouTube, Perplexity, and GPT-5. Accessibility doesn't mean limitation — it means creativity through persistence.
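For anyone new to Markdown, here's a tiny sample of the kind of syntax I mean — everything is plain, speakable text:

```markdown
# Heading

This word is **bold**, and this one is *italic*.

- A bullet point
- Another bullet point

[A clean, direct link to a file](https://example.com/notes.md)
```

A screen reader announces each character as you type it, so there's nothing hidden behind a toolbar button.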
When I put it all together — Write.as as my site builder, Markdown for structure, VoiceOver and NVDA for navigation, and Cloudflare for hosting and management — I have everything I need to create, maintain, and grow my digital world.

So yes — I build and manage a full website without reading print. My tools talk, I listen, and I translate sound into structure. Being print-impaired doesn't close the door on web development. It just means I build by ear — and I'm damn good at it.

#Accessibility #BlindCreators #WriteFreely #Markdown #VoiceOver #NVDA #Cloudflare #TechForAll #madamgreen #SelfTaughtDev
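As one illustration of the "no sighted steps" part of that workflow: R2 uploads don't need the dashboard at all. A minimal sketch using Cloudflare's wrangler CLI from the terminal, where the bucket name `public-files` and file names are hypothetical examples, not my real setup:

```shell
# Upload a local file to an R2 bucket entirely from the terminal.
# Bucket "public-files" and key "posts/hello.md" are made-up examples;
# wrangler reads your Cloudflare credentials from its own login session.
npx wrangler r2 object put public-files/posts/hello.md --file ./hello.md
```

Every step of that is plain text in a terminal, which a screen reader speaks line by line — no drag-and-drop upload widget to fight with.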

People sometimes assume that using Voice Control means I'm slowing down. The truth is the opposite — I use it to keep up. Being print-impaired doesn't mean I lack literacy or drive; it means my eyes and brain process written information differently. So instead of chasing letters across a glowing screen, I command my devices by voice. I tell them what to do — and they listen.

With Voice Control, I can:

• Open apps, write text, and format posts faster than most people can drag a mouse.
• Jump between windows, edit Markdown, and manage Cloudflare dashboards without ever touching a cursor.
• Dictate and correct on the fly — the same way a sighted developer glances and types.

It's not about convenience; it's about speed parity. Screen readers like VoiceOver and NVDA are powerful — they turn visual interfaces into sound. But some of their built-in workflows can be slower than a sighted person's visual navigation. That's where Voice Control bridges the gap. I can speak a command and skip several keystrokes or navigation layers that a screen reader alone would take time to announce. Voice Control doesn't replace my screen reader — it accelerates it. It's the missing rhythm section in an already-complex orchestra of tools.

Sighted people rely on visual scanning to move fast. I rely on structured commands and muscle memory. Once you know the vocabulary of your device — "Open Notes," "Click Upload," "Press Return" — it becomes choreography. Voice Control levels the field. It lets me match the pace of my peers in meetings, projects, and collaborative spaces. I can think, speak, and act without losing time to visual fatigue or inaccessible design.

For a print-impaired person, voice isn't a crutch. It's an interface — the one that keeps me in sync with a sighted world built around speed. I build, write, and manage the same way others do — just with sound as my keyboard and rhythm as my cursor.
#Accessibility #VoiceControl #BlindCreators #VoiceOver #NVDA #TechForAll #madamgreen #RosieWrites

For blind and print-impaired creators who build by sound

🌍 What Voice Control Does

Voice Control lets you run your entire device by voice — tapping, typing, navigating, and editing hands-free. It's built into Apple systems, integrated through Windows Speech Recognition, and available via Google Voice Access on Android. For print-impaired and blind creators, it's not just assistive tech — it's a speed equalizer. It keeps pace with fast-moving, sighted environments by replacing visual scanning with direct commands.

💻 Enable Voice Control — macOS, iOS, iPadOS, Windows, Android

🍎 iPhone / iPad (iOS & iPadOS)

1. Go to Settings → Accessibility → Voice Control
2. Tap Set Up Voice Control
3. Follow the quick tutorial, then toggle Voice Control ON
4. You'll see a blue microphone icon when it's listening

🗣️ Say "Open Notes," "Click Upload," or "Scroll down." To pause listening, say "Go to sleep." To resume, say "Wake up."

📘 Bonus: You can add Custom Commands under Settings → Accessibility → Voice Control → Custom Commands to automate tasks like "Open Write.as" or "Start new blog post."

💻 macOS (MacBook / iMac)

1. Choose Apple Menu → System Settings → Accessibility → Voice Control
2. Turn Voice Control on
3. The mic icon appears in your menu bar — you're ready

Voice Control works system-wide: Mail, Finder, Safari, Notes, Markdown editors — all respond to voice commands.

🪟 Windows 10 / 11

Windows calls it Speech Recognition.

1. Open Settings → Accessibility → Speech (on Windows 10, Settings → Ease of Access → Speech)
2. Turn on Windows Speech Recognition
3. A microphone bar appears on-screen
4. Say "Start Listening" to activate, "Stop Listening" to pause

Windows 11 also offers the newer Voice Access under the same Speech settings, with more natural numbered-overlay commands.

🧠 For print-impaired developers: Pair Speech Recognition with NVDA or Narrator for full feedback. It's slower than Apple's system, but great for dictation, editing, and file navigation.

🤖 Android (Google Voice Access)

1. Open Settings → Accessibility → Voice Access
2. Turn on the Voice Access Shortcut
3. Launch Voice Access from the accessibility button or by saying "Hey Google, Voice Access on."

When active, numbered labels appear over buttons and text fields. Say the number or command ("Tap 7," "Scroll down," "Go back"). Voice Access integrates with TalkBack, so you can combine speech and auditory feedback just like VoiceOver.

🧭 Core Commands (All Platforms)

| Action | Example Command |
| --- | --- |
| Open app | "Open Notes" / "Open Chrome" |
| Click a button or link | "Click Upload" / "Click OK" |
| Scroll | "Scroll down" / "Scroll up" |
| Select text | "Select last sentence" / "Select all" |
| Delete text | "Delete that" / "Delete line" |
| Dictate text | Speak naturally, including punctuation |
| Undo / Redo | "Undo that" / "Redo that" |
| Pause / Resume | "Go to sleep" / "Wake up" |

⚡ Why Voice Control Matters

Screen readers like VoiceOver and NVDA give blind users access to every interface — but their workflows can be linear and slower. Voice Control fills that gap. One spoken phrase can replace a chain of keyboard commands or navigation layers. For a print-impaired creator, that speed parity is liberation. It lets you code, edit, publish, and multitask at the same rhythm as your sighted peers. Voice Control turns accessibility into efficiency.

Sound is my keyboard. Rhythm is my cursor.

#VoiceControl #Accessibility #BlindCreators #VoiceOver #NVDA #TechForAll #madamgreen #RosieWrites