Yes. In a normal React web app you can access the camera, location services, the file system and more.
Have you ever thought how awesome it would be if you could keep coding with React and web technologies and still get your apps, with native functionality, onto the Apple App and Google Play stores? You wouldn’t need to learn Swift or Kotlin, nor would you have to build the same app multiple times. I’m going to show you a simple, seamless way to do this.
First, open a command line, or open the terminal in VSCode. Let’s add our packages.
npm i @capacitor/core @capacitor/cli
Capacitor is a library that lets any browser-based web application access hardware services on mobile devices. This means we can keep writing our code using React, and keep using URL routing through React Router and other familiar tools like Redux. That’s right: nothing changes in your workflow. The only difference is that through Capacitor you now have access to device hardware and can create apps for the Apple App Store and Google Play Store.
Now that we’ve installed Capacitor, we simply need to turn it on for the mobile platforms we care about in our project. In the interest of time I’ll keep this story to iOS devices only, but Android works just as well. To enable Capacitor for iOS, run these commands. Note that when running init you’ll be prompted for an app name and id; you can use whatever you like since this is just a test.
npx cap init
npx cap add ios
So what we’ve done here is enable Capacitor in our project and create a folder that holds all the iOS-related files and assets. You will never need to touch these files; they are auto-generated by Capacitor and copied into Xcode on your Mac when you’re ready to build your iOS project. Let’s make one small change before moving on. Open the file capacitor.config.json and update the webDir value to be “build” instead of “www”. Since our React project saves its production build into the build folder, we need to let Capacitor know this. Now I’ll show how to add camera capability. Open the App.js file and update it like this.
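The App.js listing itself isn’t reproduced in this copy, but the camera handler it describes can be sketched roughly like this. In the real file, Camera would come from `Plugins` in @capacitor/core and the result type would be the `CameraResultType.Uri` constant; in this sketch the plugin is injected as a parameter so the logic can run anywhere, and the names (`takePhoto`, `defaultCameraOptions`) are illustrative, not the article’s exact code.

```javascript
// Default configs along the lines the article mentions.
const defaultCameraOptions = {
  quality: 90,
  allowEditing: false,
  resultType: 'uri', // CameraResultType.Uri with the real @capacitor/core import
};

// Core of the useCallback handler: ask the Camera plugin for a photo and
// return it so the component can store it in state.
async function takePhoto(Camera, options = defaultCameraOptions) {
  const photo = await Camera.getPhoto(options);
  return photo; // e.g. { webPath, path, format }
}
```

In the component, this would be wired up roughly as `useCallback(() => takePhoto(Plugins.Camera).then(setPhoto), [])`, attached to the single Camera button.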
As you can see, the @capacitor/core package is imported and some properties are pulled from it. The most important one, Plugins, is used to access the camera. We use a useCallback React hook to initialize the camera with some default configs, and the interface is just one button that triggers it. Let’s run this on our iOS device by using the Capacitor CLI commands to move our code over to Xcode.
npm run build // need to build our project successfully first
npx cap sync // need to sync the ios folder into our Xcode project once
npx cap copy ios // copies all our build files into Xcode
npx cap open ios // opens the Xcode project
You might find these extra commands cumbersome, but consider what’s going on here. Since we are building a native app, our code has to be compiled into a native iOS binary, and the only way to do that is with Xcode. That’s what these steps are for. Now run the project on your iOS device. Since our UI is nothing but a single button that says camera, that’s all you’ll see. Tap it and the camera app appears, looking like this.
Sweet, we now have a working camera with almost no effort. Now let’s save the photo to the filesystem so we can view it later. Add this code to the App.js file.
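The added code isn’t shown in this copy, but the read-then-write flow described below can be sketched like this. `Filesystem` and `convertFileSrc` are injected as parameters here (in the real App.js they come from @capacitor/core), and the `savePhoto` name and the `'Data'` directory string are illustrative — the real code would use the `FilesystemDirectory.Data` constant.

```javascript
// Move a freshly taken photo out of temp storage into a permanent app
// directory, and return a URL that an <img src> can display.
async function savePhoto(Filesystem, convertFileSrc, photo, fileName) {
  // 1. Read the photo out of its temporary location.
  const { data } = await Filesystem.readFile({ path: photo.path });

  // 2. Write it into the app's permanent data directory.
  await Filesystem.writeFile({ path: fileName, data, directory: 'Data' });

  // 3. Retrieve the final filesystem path of the saved copy...
  const { uri } = await Filesystem.getUri({ path: fileName, directory: 'Data' });

  // 4. ...and convert that filesystem path into a URL for the img tag.
  return convertFileSrc(uri);
}
```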
You can see we are now using a useEffect that runs whenever the photo object changes; this object holds the photo just taken by the camera. The thing with photos is that when a picture is taken it is first saved into temp storage, and the device may delete it at some point in the future, so we need to move it into a permanent app directory. This is where Capacitor’s Filesystem API comes in, giving us access to the native filesystem to read and write files. So in the code we first read the file from its temp location with readFile, then write it with writeFile into a permanent directory, and once it’s written we retrieve the final path and convert that filesystem path into a URL so we can pass it to the img tag’s src attribute later. Note we also updated the JSX for alignment. Before looking at the screen again, let’s add code to save these image file names into local storage so that we can retrieve the images and display them in a list.
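Since the listing isn’t reproduced here, this is a hedged sketch of the local-storage bookkeeping the next paragraph describes. The `Storage` plugin is injected as a parameter, the `'photos'` key is an assumption, and plain `Date` formatting stands in for the moment-based file names the article mentions.

```javascript
const PHOTOS_KEY = 'photos'; // illustrative storage key

// Build a readable file name, e.g. photo-2021-04-01T12-30-00-000Z.jpeg
// (the article uses moment for this; Date.toISOString stands in here).
function makeFileName(date = new Date()) {
  return `photo-${date.toISOString().replace(/[:.]/g, '-')}.jpeg`;
}

// Read the saved list. Storage holds strings, so the array is JSON-encoded.
async function loadPhotoNames(Storage) {
  const { value } = await Storage.get({ key: PHOTOS_KEY });
  return value ? JSON.parse(value) : [];
}

// Append a new file name and persist the updated list.
async function addPhotoName(Storage, fileName) {
  const names = await loadPhotoNames(Storage);
  names.push(fileName);
  await Storage.set({ key: PHOTOS_KEY, value: JSON.stringify(names) });
  return names;
}
```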
There’s quite a bit of new code, but the main thing to see is that we are using Capacitor’s Storage API, via its get and set methods, to read and write local storage with the names of the photo files. Notice also I’m using moment to make the file names a little cleaner. The key thing to pay attention to is that the convertFileSrc function is required to convert a system file path into a URL; otherwise img tags will not recognize the path and will fail to load. Here’s what the screen now looks like.
You can see not only the last taken pic on top, but a list of all previous pictures beneath the camera button. Awesome, we’ve created a React web app that works on mobile devices as a native app. Now let’s do one more thing and add a location for each image taken. Update your App file like this.
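The updated App file isn’t shown in this copy; below is a hedged sketch of the geolocation step described next. `Geolocation` is injected as a parameter (in the real file it is pulled from `Plugins`), and the shape of the photo record (`{ fileName, latitude, longitude }`) is illustrative.

```javascript
// Ask the Geolocation plugin for the device's current coordinates.
async function getCoordinates(Geolocation) {
  const position = await Geolocation.getCurrentPosition();
  return {
    latitude: position.coords.latitude,
    longitude: position.coords.longitude,
  };
}

// Combine a saved file name with the current location into the record
// the picture list renders as labels.
async function buildPhotoRecord(Geolocation, fileName) {
  const { latitude, longitude } = await getCoordinates(Geolocation);
  return { fileName, latitude, longitude };
}
```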
All we did was grab the Geolocation object from Plugins on line 19. Then we call that object’s getCurrentPosition function, which gets the coordinates. We call it later when building the array of photos, passing the latitude and longitude along with the file name, starting at line 46. We then add the file name and coordinates as labels on our picture list. On the screen it looks like this. Note the prior images won’t have coordinates, since we weren’t capturing them at the time those were taken.
Let’s think about what just happened here. You took your existing React and web skills and created a native app. No Flutter, PhoneGap, Java, Kotlin, Swift, etc. No need to write the same code two or three times. No need to build a quasi-web app using React Native and React Native Web … You’re welcome.
As always, if you like helping other devs, try DzHaven.com