• Click2Speak was founded in 2013 with the vision of revolutionizing the way individuals with severe physical impairments communicate. Focused on empowering people through a new on-screen keyboard, Click2Speak works in close collaboration with end users and professionals to offer cutting-edge assistive technology. 
    It all started in 2009 when Gal, the driving force behind this inspiring project, was diagnosed with ALS (amyotrophic lateral sclerosis, also known as Lou Gehrig’s disease). 

    Gal: 
    “…My ALS started in my right arm and hand. ALS is a disease of the nerve cells in the brain and spinal cord that control voluntary muscle movement. My arm, hand, and fingers gradually weakened, and as I am right-handed, in order to operate a regular PC mouse I had to use an armrest mouse pad, practically using my shoulder for control. For mouse clicking, I used my leg to press a button on the floor. 
    As my disease progressed, controlling the mouse became very difficult and I began looking for alternatives. The first option was a head mouse, which translates the natural movements of the user’s head into directly proportional movements of the pointer on the screen. This is a relatively inexpensive solution, costing approximately $500-1,000, but it can be somewhat uncomfortable and only serves as a temporary solution, since the illness eventually takes away head control. Another option was a freeware solution utilizing a webcam; it was interesting and probably works great for many individuals, but it did not meet my personal standards as an avid PC user/programmer/gamer. 

    I decided to select the ‘state of the art’ option – an eye gaze camera. Most modern eye trackers use the center of the pupil and infrared / near-infrared non-collimated light to create corneal reflections. The vector between the pupil center and the corneal reflections can be used to compute the point of regard on a surface, or the gaze direction. In the PC environment, this is eventually translated into pointer positioning on the screen (in a nutshell…). 
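    In code, the core of that computation can be sketched roughly as follows. This is a simplified illustration that assumes a second-order polynomial calibration mapping from the pupil-glint vector to screen coordinates; the function and variable names are hypothetical, not taken from any specific tracker:

```python
import numpy as np

def fit_calibration(pupil_glint_vectors, screen_points):
    """Fit a 2nd-order polynomial mapping from pupil-glint vectors
    (vx, vy) to screen coordinates (sx, sy), using samples collected
    while the user looks at known calibration dots."""
    v = np.asarray(pupil_glint_vectors, dtype=float)
    s = np.asarray(screen_points, dtype=float)
    vx, vy = v[:, 0], v[:, 1]
    # Design matrix with polynomial terms: 1, vx, vy, vx*vy, vx^2, vy^2
    A = np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx ** 2, vy ** 2])
    coeffs, *_ = np.linalg.lstsq(A, s, rcond=None)  # one column per screen axis
    return coeffs  # shape (6, 2)

def gaze_point(pupil_center, glint_center, coeffs):
    """Map one pupil-glint vector to an estimated on-screen gaze point."""
    vx, vy = np.asarray(pupil_center, float) - np.asarray(glint_center, float)
    features = np.array([1.0, vx, vy, vx * vy, vx ** 2, vy ** 2])
    return features @ coeffs  # estimated (sx, sy) in pixels
```

    With a handful of calibration dots (typically five to nine), the least-squares fit absorbs per-user and per-setup differences, which is, very roughly, the role calibration plays in a commercial tracker.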

    Realizing how cost-prohibitive these were (well over $1,000), and utilizing my extensive technical background, I decided to explore and build my own eye gaze camera using open-source software and purchasing the needed parts online (camera, lens, infrared illuminators and a mini tripod). Thus, spending under $500, I had a working eye-tracking camera to control the pointer on my PC with 0.5-degree accuracy, achieved by implementing relative error handling during calibration and adding dynamic calibration points. Happily, less expensive eye gaze cameras appeared on the market, and I decided to focus my energies elsewhere. 
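    To put 0.5-degree accuracy in perspective, a quick back-of-the-envelope conversion shows what it means on screen (assuming a typical 60 cm viewing distance and a 96 DPI monitor; these are illustrative numbers, not measurements from this build):

```python
import math

viewing_distance_cm = 60.0   # assumed eye-to-screen distance
accuracy_deg = 0.5           # angular accuracy of the tracker
dots_per_inch = 96.0         # assumed monitor pixel density

# Project the angular error onto the screen plane
error_cm = viewing_distance_cm * math.tan(math.radians(accuracy_deg))
error_px = error_cm / 2.54 * dots_per_inch

print(f"~{error_cm:.2f} cm, ~{error_px:.0f} px of on-screen error")
# -> ~0.52 cm, ~20 px
```

    Roughly half a centimeter, or about 20 pixels of error, which is comfortably smaller than a typical on-screen key.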

    After being diagnosed with the disease, I contacted other individuals who suffer from ALS at different stages, and began to learn about the different challenges that I would face as my disease progressed. I also learned about the tech solutions they used to cope with these challenges. The most basic challenge was typing, which is done using a virtual on-screen keyboard, a common solution shared not only by individuals affected by ALS, but also by people coping with conditions such as brain trauma, MS, and spinal cord injuries. The fully featured, advanced on-screen keyboards again proved relatively expensive (starting at $250), so I decided to develop the ultimate on-screen keyboard on my own. Through the development process, my own physical condition continued to deteriorate and I reached the point of needing to use these cameras and on-screen keyboards myself. I started with Microsoft’s ‘Ease of Access’ keyboard that comes with Windows. This is an acceptable keyboard and it has a reasonable prediction engine. 

    For my own development needs I purchased the developer version of Tobii’s eye gaze camera. This allowed me to code (with my eyes!) additional important features for eye control that were lacking in the Microsoft keyboard, such as highlighted keys, virtual keys, auto-scroll, right click, drag and much more.


    Gal sitting at his 'nerve center' interviewed by the BBC

    It quickly became apparent that using our ‘powered by SwiftKey’ keyboard enabled me to work faster and more accurately. Friends who had used other solutions prior to ours (not necessarily Microsoft’s) were delighted with the results, albeit a small sample size. 
    This started a new journey that introduced me to SwiftKey’s revolutionary technologies and to how we could customize them to our specific needs. I reached a first version of our keyboard and distributed it to friends who also suffer from ALS. They gave us invaluable feedback throughout the development process, and they all raved about its time-saving capabilities and accuracy, and about how it makes their lives a little easier. Even SwiftKey’s ‘Flow’ feature translates successfully to this environment: the finger you would use with SwiftKey on an Android device is replaced by an eye, head, or leg when using a PC/tablet/laptop with a camera or other input device and our SwiftKey-powered keyboard installed. 

    At this point I had my good friend Dan join me in this endeavor, as I needed help with detailed design, quality assurance, market research, project management, and many other tasks. We formed ‘Click2Speak’, and we plan to make the world a better place! ...” 

    Dan: 
    “…Gal and I have been best friends for almost 20 years. I have followed his amazing career as a senior developer and CTO. Gal is a true friend, one of the brightest minds that I have ever met, and a fearless and seasoned entrepreneur. Of course, I was devastated to hear about Gal’s illness a few years ago, but Gal’s spirit during the tough struggle with ALS has never been broken. When Gal approached me around a year ago with the idea of developing a superior on-screen keyboard that utilizes SwiftKey’s revolutionary features for the benefit of the severely paralyzed, I was extremely intrigued. I knew that I would love to join Gal in this journey to improve the way people around the world communicate. I also knew that nothing would stop Gal from reaching our goals, not even something ‘minor’ like a progressing ALS condition; Gal would move forward using his eyes, feet, whatever… when Gal has passion for something, nothing stops him…” 

    Combined with high-performing hardware such as eye-tracking cameras, head or foot mice, or even sniff-based (nose-breathing) PC input devices, our ‘powered by SwiftKey’ on-screen keyboard software can help users overcome some of the communication difficulties caused by conditions such as stroke, ALS, cerebral palsy, and spinal cord injury, and make their PC input process faster, more accurate, and more fun.