Fingopay meets React Native.

Earlier in 2018, the finger-based biometric platform Fingopay approached us to help with their high-tech biometric payment system and, needless to say, we've had a blast helping them create a proof-of-concept digital product that tapped directly into their finger-scanning technology using React and React Native.

Fingopay has developed a sophisticated, patented vein-recognition scanner used as the gateway to a payment system they wanted to roll out in pubs and bars around London. You can find out more about their technology here ⬇️:

Fingopay | How Vein Recognition works

The project goal was to improve and simplify the onboarding process, so we ran a one-day hackathon to build a functional prototype for the client.

Since we only had one day and the client only had a web plugin for connecting to the finger scanner, we built the prototype with React on the web. We showed the quickly built, working web prototype to the client and they LOVED it! That's how we kicked off two one-week sprints in pursuit of an effective onboarding process.

We approach projects in a very "Lean" way (duh, it's in our name! XD), which means our process consists of quick test-build-learn cycles. The cycle always starts and ends with the user, and thankfully we have one of the best UXers in the business working with us! If you want to know more about Paul's part in the project, you can find it here.

This is what the user flow looked like:

Let’s talk about the technical challenges

Since we were working with hardware, we started by looking for the best approach to a maintainable and secure connection to the finger scanner, and here's where React Native comes into play.

React Native provided the smooth UI the client required and let us connect to the Android tablet that hosted the C++ biometric scanner plugin.

The Android plugin worked in the following way: it turned the scanner on, scanned the user's finger veins, and produced an encrypted code containing the information of that vein pattern. The encrypted code was then sent to an API, where a matching algorithm resolved the finger to a registered user.
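From the JavaScript side, that lifecycle can be sketched roughly like this. Note that the module name `FingoScanner`, its methods, and the stub implementation are all illustrative, not the actual Fingopay plugin API; in the real app the module would come from React Native's `NativeModules` registry.

```javascript
// Illustrative stub standing in for the real native module; in the app this
// object would be provided by React Native's NativeModules registry.
const FingoScanner = {
  startScan: async () => ({ template: 'encrypted-vein-template' }),
  stopScan: async () => {},
};

// Start a scan, send the encrypted vein template to the matching API,
// and always stop the scanner afterwards, even if matching fails.
async function scanAndIdentify(matchOnApi) {
  const { template } = await FingoScanner.startScan();
  try {
    return await matchOnApi(template); // resolves to a registered user (or null)
  } finally {
    await FingoScanner.stopScan();
  }
}
```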

When building products in the digital identity space there are important features you need to support, such as facial recognition and scanning documents like IDs, passports, and credit cards. React Native was the best solution for the job since all the features we needed to build were accessible via native modules. And since we are React lovers, it was a win-win situation ;)

Native Modules: talking to the finger scanner

React Native lets you communicate with native modules and APIs via a Bridge. Imagine it as a gateway between the JS thread (your React Native app) and the UI thread (the platform's native side). All of React Native's APIs use it to send instructions to the native side, for example to create your UI (via the Yoga layout engine).
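Conceptually, each call from JS is recorded and handed across the bridge in batches; the real implementation lives in React Native's internal `MessageQueue`, but a toy model of the idea (names heavily simplified, not React Native internals verbatim) looks like this:

```javascript
// Toy model of the bridge's call queue: JS-side calls are recorded as
// [module, method, args] tuples and flushed to the native side in batches.
const callQueue = [];

function enqueueNativeCall(module, method, args) {
  callQueue.push([module, method, args]);
}

function flushQueue(deliverToNative) {
  // Drain everything queued so far and hand it over in one batch;
  // in React Native, this hand-off is what crosses the JS/native boundary.
  const batch = callQueue.splice(0, callQueue.length);
  deliverToNative(batch);
  return batch.length;
}

enqueueNativeCall('FingoScanner', 'startScan', []);
enqueueNativeCall('UIManager', 'createView', [1, 'RCTView']);
```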

Since there was no native module for Fingopay's finger scanner, we built one and enabled a React Native app to connect with the biometric scanner, which was super cool! If you are curious about how we did it, we simply followed the excellent instructions in the React Native docs.

The most important part of the native-module integration is exposing the module's methods to JavaScript: on Android, annotating a method with @ReactMethod is what makes it callable from the React Native side (the JS thread).
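On the JS side, an exposed method appears on `NativeModules`, and since everything crosses the bridge asynchronously, callback-style native methods are commonly wrapped into promises. A hedged sketch of that wrapping, with a stubbed native method standing in for the real scanner plugin:

```javascript
// Stub for a callback-style native method as exposed over the bridge:
// it reports its result via success/error callbacks rather than returning.
const nativeStartScan = (onSuccess, onError) => {
  setTimeout(() => onSuccess('encrypted-vein-template'), 0);
};

// Wrap a (...args, onSuccess, onError) native method into a promise so the
// app code can simply `await` it.
function promisify(nativeMethod) {
  return (...args) =>
    new Promise((resolve, reject) => nativeMethod(...args, resolve, reject));
}

const startScan = promisify(nativeStartScan);
```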

Here’s our Native module working:

Now, with the native module ready, we were able to start a scan, read data, get the result, and stop scanning from our React Native app 🎉🎉

The Credit Card Recognition

With the first (and most important) feature built, we needed a way to easily and safely scan the user's credit card and associate their payment information with their newly created unique finger hash. We tested some services and ended up using card.io for credit-card recognition, owing to its support for React Native.
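With a community card.io wrapper for React Native, the scan boils down to one promise-returning call. The sketch below assumes a `CardIOModule.scanCard(config)` API in the spirit of those wrappers; the option names and the stubbed result are illustrative, not the library's exact surface.

```javascript
// Stub standing in for the card.io native module; in the app this would be
// imported from the React Native card.io wrapper. Field names follow the
// general shape of a card.io scan result but are illustrative here.
const CardIOModule = {
  scanCard: async (config) => ({
    cardNumber: '4111111111111111',
    expiryMonth: 12,
    expiryYear: 2021,
  }),
};

// Launch the scanner and keep only what the payment flow needs.
async function captureCard() {
  const card = await CardIOModule.scanCard({
    hideCardIOLogo: true, // option names assumed for illustration
    requireExpiry: true,
  });
  return {
    number: card.cardNumber,
    expiry: `${card.expiryMonth}/${card.expiryYear}`,
  };
}
```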

The hardest part was setting up the camera so it could scan the credit card in low light, since the tablet was going to live in a pub; nothing too complex, though, if you are familiar with linking libraries into your app. Usually the library you are linking includes specific instructions on how to do it (well, most of them do 😉).

ID Recognition & Face Detection

Because we were dealing with payments and alcohol, legal issues needed to be taken care of right from the outset. One important challenge for automatic payment systems is certifying that the person going through the onboarding process is actually the person on the ID and credit card provided.

To enable this, we implemented one more step in the process: taking a selfie of the person onboarding and matching it against the picture on the provided ID. Initially this matching was done manually, but everything was prepared (the automated services were contacted and tested) for when the product went live.
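Once automated, the check reduces to a simple gate: the face-match score returned by whichever verification service is plugged in has to clear a threshold, and the ID document has to pass its own validation. A minimal sketch; the threshold value and the 0-to-1 score scale are illustrative assumptions, not part of the original setup:

```javascript
// Hypothetical gate for the identity step: both checks must pass before the
// onboarding can complete. faceMatchScore is assumed to be in [0, 1].
const MATCH_THRESHOLD = 0.85;

function identityVerified({ faceMatchScore, idDocumentValid }) {
  return idDocumentValid && faceMatchScore >= MATCH_THRESHOLD;
}
```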

Result and Next Steps

Our UX expert Paul doing some User testing

The whole process of building and iterating took around one week of total dev time, spread across the two one-week sprints. We ran multiple testing sessions and went through a lot of trial and error until we had a solid MVP the client could use to sell the solution to their partners. A total win for us and our client ❤️

Here’s the final prototype (screens) we built:

If you have any technical questions about the stack mentioned please send me a message, leave a comment here, or reach out to us via social media!

Thanks for reading 🙏 🙏 🙏 🎉 🎉 🎉

Looking forward to hearing your thoughts and insights!!

Horacio Herrera

Freelance Designer/Developer. @gquiroga31 husband. Web. Javascript. React. React Native. GraphQL.