Invisible Pen — make any screen a touchscreen without special hardware

The year 2020 demanded big changes in our lifestyle, especially in how we get things done remotely, and networking and collaboration took a big hit. The technology simply isn't there yet to replace in-person human interaction for everyday tasks. One such gap is the ability to express ideas freely on a whiteboard, whether during a meeting or while teaching. With just a computer screen and default hardware like a mouse and keyboard, a teacher cannot easily draw something to explain a concept that wasn't clear from her slides. But as the famous saying goes, necessity is the mother of invention, and people did come up with creative ways to solve this:

  • A teacher projecting her handwritten notes by placing her phone on an elevated platform
  • A student using a compact disc as a mirror to project their keyboard over a Zoom video call
  • Another teacher writing on a mirror with a bright marker so that students can see both his face and what he is writing

All the above solutions work great and are extremely economical, but the overall experience for both the presenter and the viewer is suboptimal. On the other end of the spectrum are expensive solutions like the Apple Pencil on iPads, touch-screen monitors, or custom hardware like Airbar. These work great for those who want them and can afford them. Can we have a middle ground?

Coming from a computer science and applied machine learning background, I wondered: what if we could use a phone and a laptop together, a combination present in most households these days, to solve this problem? That is, my ideal solution would combine the affordability of the DIY hacks above with the experience of the dedicated hardware.

Invisible Pen is one such attempt. While I haven't solved it completely (and am looking for like-minded folks to work with), here is how the current solution works:

The Invisible Pen setup:
  • The mobile device, mounted on a tripod, tracks the live hand movements of the presenter or teacher
  • A server on the laptop receives the coordinates of the index finger from the mobile device over a secure local connection
  • The server translates the finger coordinates into pixels on the screen and moves the mouse pointer, thereby tracing the presenter's drawing on the screen
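The server side of the steps above can be sketched in a few lines of Flask. The `/finger` endpoint name and the JSON payload shape are illustrative assumptions, not the project's actual API, and the pointer move is kept behind a small helper so the sketch runs even without a display attached (in the real setup it would call `pyautogui.moveTo`):

```python
# Minimal sketch of the laptop-side server. The phone is assumed to POST
# normalized index-finger coordinates as JSON, e.g. {"x": 0.5, "y": 0.5}.
from flask import Flask, request, jsonify

SCREEN_W, SCREEN_H = 1920, 1080  # in practice, query pyautogui.size()

app = Flask(__name__)

def move_pointer(x, y):
    # Stand-in for pyautogui.moveTo(x, y), kept as a seam so this
    # sketch runs headless; recorded here for inspection.
    app.config["last_pointer"] = (x, y)

@app.route("/finger", methods=["POST"])
def finger():
    data = request.get_json()
    # handpose reports coordinates in camera space, normalized to [0, 1].
    px = int(data["x"] * SCREEN_W)
    py = int(data["y"] * SCREEN_H)
    move_pointer(px, py)
    return jsonify(x=px, y=py)

if __name__ == "__main__":
    # Bind to all interfaces so the phone can reach the laptop over the LAN.
    app.run(host="0.0.0.0", port=5000)
```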

Note that the same idea works directly on the laptop screen and doesn't require an additional monitor.

Since we directly control the mouse, this setup can be used beyond rough sketching, e.g. to flip the pages of an e-book or a PDF document.
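As an illustration of the page-flipping idea (not the project's actual code), successive finger positions could be interpreted as a horizontal swipe and mapped to a key press. The thresholds below are made-up defaults; in the real setup a detected swipe would trigger something like `pyautogui.press("pagedown")`:

```python
# Hypothetical gesture mapping: a fast horizontal swipe turns the page.
def detect_swipe(xs, min_travel=0.4):
    """xs: recent normalized x-coordinates of the finger (oldest first).
    Returns 'next', 'prev', or None for no gesture."""
    if len(xs) < 2:
        return None
    travel = xs[-1] - xs[0]
    if travel >= min_travel:
        return "next"   # swipe right -> next page (e.g. pagedown)
    if travel <= -min_travel:
        return "prev"   # swipe left  -> previous page (e.g. pageup)
    return None
```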

Tech Stack

  • On-device hand tracking is done by handpose, a TensorFlow.js model
  • Front-end business logic is written in React.js and Next.js
  • Predicted hand coordinates are sent wirelessly to a Python Flask web server on the same local network
  • Our custom translation logic, written in Python, converts hand coordinates to screen pixel coordinates
  • The pyautogui and pynput Python libraries control the laptop's mouse and keyboard
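The translation step can be sketched as a plain mapping from normalized camera coordinates to screen pixels. The active-region cropping and horizontal mirroring below are assumptions about what such logic typically needs (the camera faces the presenter, and reaching the frame edges is awkward), not the project's exact code:

```python
def camera_to_screen(cx, cy, screen_w, screen_h,
                     region=(0.1, 0.1, 0.9, 0.9), mirror_x=True):
    """Map a normalized camera coordinate (cx, cy) in [0, 1] to a
    screen pixel. `region` crops the usable part of the camera frame
    (illustrative defaults) so the presenter doesn't have to reach
    the frame edges to hit the screen corners."""
    x0, y0, x1, y1 = region
    # Rescale the cropped region back to [0, 1], clamping the edges.
    nx = min(max((cx - x0) / (x1 - x0), 0.0), 1.0)
    ny = min(max((cy - y0) / (y1 - y0), 0.0), 1.0)
    if mirror_x:
        # The camera faces the presenter, so mirror horizontally.
        nx = 1.0 - nx
    return int(nx * (screen_w - 1)), int(ny * (screen_h - 1))
```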

Currently the mouse is moved only if the user presses a (customizable) hotkey on their keyboard, but this can be simplified or automated in the future if need be. The entire source code will be made available on GitHub soon.
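The hotkey gate can be sketched as a small toggle that the key-event callback feeds into. In the real setup, pynput's `keyboard.Listener` would deliver the key events; the key name here is an illustrative placeholder for the customizable hotkey:

```python
# Hypothetical hotkey gate: tracking is forwarded to the mouse only
# while the gate is enabled, and the configured hotkey toggles it.
class HotkeyGate:
    def __init__(self, hotkey="f8"):
        self.hotkey = hotkey    # illustrative default, customizable
        self.enabled = False

    def on_key(self, key_name):
        """Called for every key press; toggles on the configured hotkey.
        Returns the current enabled state."""
        if key_name == self.hotkey:
            self.enabled = not self.enabled
        return self.enabled
```

The server would then check `gate.enabled` before translating finger coordinates into mouse moves.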

On-going work

We have just gotten started and have only proved that this setup is usable and convenient. There is a long way to go, with a ton of exciting computer science challenges still to be solved. Comment below if you would like to collaborate on this open-source work!

