The Qualcomm Developer Network's August Developer of the Month is Eugene Panich of Almalence, a company with offices in Austin, Texas (USA), Israel, and Russia. Eugene is the CEO and co-founder of Almalence and is well known in the industry for his understanding of image computation solutions and systems.
Almalence, Inc. was founded in 2005 and is focused on harnessing the computational power of today’s mobile devices to overcome the physical limits of their cameras and other optical systems. Its SuperSensor, Digital Lens for VR, and mobileDSLR products provide cutting-edge image computation solutions that augment and enhance image quality for still and video capture on mobile devices, and enable realistic, immersive experiences through high picture quality on VR devices.
How was Almalence started?
The whole thing was pretty much unplanned. After an informal discussion over coffee about how to bring super-resolution methods into end-user products, my partner and I found ourselves researching the subject. Much to our surprise, the results suggested a very promising technology. It took us a while to bring our first product to market, but we soon started generating revenue, and our product was featured in a professional photography book [Photographic Multishot Techniques by Juergen Gulbins and Rainer Gulbins].
What can you tell us about the products you develop?
Optical systems, such as a camera or a VR display, have image-quality limitations due to design constraints and the laws of physics. For example, since a smartphone camera must be small, it cannot collect enough light to produce a clean image in the dark nor can it implement a lossless optical zoom. As another example, a VR display lens must be thin and lightweight which makes it prone to aberrations and blur.
Top: frames from video clips taken with a smartphone. Bottom: the same smartphone with SuperSensor (images provided by Almalence)
Almalence’s image processing technologies allow the imaging systems in such devices to deliver better quality than would be possible through the hardware alone. Our video SuperSensor for example, simultaneously provides lossless zoom, super-resolution enhanced low-light performance, and dynamic range for smartphone cameras, while our Digital Lens for VR helps to optimize display performance for today’s pixel-dense VR displays.
Sometimes our technologies lead to the design of new products that would not be feasible without supplementary processing. For example, our mobileDSLR uses a combination of computational, optical, and mechanical solutions designed to deliver DSLR-quality images using a smartphone’s camera.
Where does your team get inspiration?
Our team draws inspiration from our quest to build technologies, at a high level of quality, that nobody else has built before. Having started from scratch with no strategic partnerships or funding, our team is proud to have won business with top smartphone OEMs, leaving behind more powerful and well-established competitors. On a more personal level, I’m truly inspired by our team of super-talented senior-level engineers.
What Qualcomm processor are you using in your products?
We use the Qualcomm® Hexagon HVX™ to run our algorithms as quickly and as power-efficiently as possible, which is crucial for smartphones and untethered XR HMDs.
As an example, our video SuperSensor is based on combining multiple frames and is computationally heavy. Before the appearance of vision processors like the Hexagon HVX, SuperSensor could only process still images, not video. Any attempt to run it on the CPU or GPU for video would result in low frame rates and overheating. Running the same algorithm on a Hexagon HVX, however, delivers normal video frame rates while consuming only a fraction of the power used by the camera subsystem.
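The core idea behind multi-frame capture can be illustrated with a minimal sketch (my own simplified example, not Almalence's algorithm): averaging N aligned frames reduces noise standard deviation by roughly √N, which is one reason combining frames improves low-light quality — and why processing every video frame this way is so computationally demanding.

```python
import numpy as np

def merge_frames(frames):
    """Average a stack of pre-aligned frames to reduce sensor noise.

    Averaging N frames cuts noise standard deviation by about sqrt(N).
    Real multi-frame pipelines also align frames and reject motion
    outliers, which this toy sketch omits.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Simulate a low-light scene: a flat gray patch plus Gaussian sensor noise.
rng = np.random.default_rng(0)
clean = np.full((64, 64), 100.0)
frames = [clean + rng.normal(0.0, 10.0, clean.shape) for _ in range(16)]

merged = merge_frames(frames)
noise_single = np.std(frames[0] - clean)
noise_merged = np.std(merged - clean)
print(f"single-frame noise: {noise_single:.1f}, merged: {noise_merged:.1f}")
```

With 16 frames, the merged noise level drops to roughly a quarter of the single-frame level, at the cost of 16× the pixel data per output frame — the kind of workload that maps well onto a wide-vector processor like the HVX.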
Do you plan on using Qualcomm technologies on future projects? If so, how?
Yes. Qualcomm Technologies develops integrated platforms that combine fast, power-efficient vision computation with a rich set of functionalities, which makes those platforms a great option on which to build our technologies. We plan to implement a demo of our VR technology on a Qualcomm 845 VR reference device, which features the Hexagon HVX as a computational engine for our needs and offers the eye-tracking functionality that is crucial for our technology to work.
Overall, we’ve found that vision processors such as the Hexagon HVX have opened up new doors to implementing features and products that were previously unfeasible. That said, I think we’ve just scratched the surface as to the potential of this new kind of computational engine. I hope to see many more new and cool solutions in the future.