Mobile HTML5 apps that can compete with native
Nolan Lawson
Hi, I'm Nolan Lawson. I work for Squarespace doing
Android and mobile web, and in my spare time I
help maintain PouchDB, a JavaScript database.
HTC Magic, 2009
I've been doing Android development for a while.
This was my first smartphone, and also the second-ever Android
phone. I thought it was pretty awesome back then.
I've written a lot of Android apps. But the problem with
all these apps is that they're written in Java, so they only
run on Android. Java used to hold the promise of being the
"write once, run everywhere" platform, but nowadays that
mantle has mostly passed to JavaScript.
- Android
- iOS
- Windows Phone
- Blackberry
- Firefox OS
- Windows
- OS X
- Linux
- Chrome OS
- etc.
So nowadays, this dream I've been tinkering with for a while
is that I could write an app once, using HTML, CSS, and JavaScript,
and then deploy it to a variety of mobile and even desktop platforms.
Today this is 100% possible with tools like Cordova née PhoneGap,
Node-Webkit, and Atom Shell.
But the problem has always been performance. The conventional wisdom
nowadays is that you just can't get native-level performance from
web technologies. Facebook famously migrated their apps from HTML5
to native, Twitter followed suit, and many other apps became thick clients
or dropped the web altogether.
Nexus 6, 2014
But you know, it's 2015, and phones have gotten a lot more
powerful since my HTC Magic's heyday. So I think we can revise
these assumptions, and give HTML5 a second chance. I think that in 2015
you can achieve native-level performance for most apps you want to write,
but you have to know the tricks in order to do it.
- Animations
- Android
- Frameworks
- Offline-first
I'm going to talk about 4 main tricks.
Animations smooth as
So what do we want? Buttery-smooth animations.
We want animations like we're used to seeing on our mobile devices,
as we swipe left and right on our home screens.
Hardware-accelerated CSS animations
On the web, there are actually lots of ways to do animations.
But in this handy diagram, I demonstrate which ones I think you
should be using.
I know this is Brooklyn JS, but I'm here to tell you: please stop
using JavaScript animations. No more jQuery animations, no more
Y.Anim; it's got to stop.
JavaScript animation (left) vs. hardware-accelerated CSS animation (right)
Look at this comparison of JavaScript animations (using Y.Anim) and
hardware-accelerated CSS animations, using the same hardware. They're
just not in the same league.
That's because JavaScript is bound to the event loop - the same
event loop that can't guarantee that setTimeout() will actually execute
that many milliseconds later. This is not what you want for your complex
bezier curves and easing functions.
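If you want to see how little setTimeout() actually promises, here's a tiny sketch you can paste into the console (the busy loop just stands in for whatever else your app happens to be doing):

var start = Date.now();
setTimeout(function () {
  // This fires whenever the event loop gets around to it - not at exactly 16ms.
  console.log('asked for 16ms, waited ' + (Date.now() - start) + 'ms');
}, 16);
// Anything hogging the main thread delays the callback, and it would delay a
// JavaScript-driven animation frame in exactly the same way.
for (var i = 0; i < 1e8; i++) {}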
By comparison, hardware-accelerated CSS animations convert the
DOM node into an image, then animate it on a separate layer using the
GPU. It's doing what GPUs were built to do, which is push pixels around.
It's off the CPU, it doesn't require a re-paint, and it's very fast.
translate3d
scale3d
rotate3d
opacity
OK, so how do we get these cool animations? First, you use CSS
animations - either keyframes or transitions are fine. But you have
to be careful to manipulate only these 4 properties. If you
try to manipulate anything else, then you won't get hardware acceleration.
There are some exceptions - e.g. the regular translate will also give
hardware acceleration in some browsers, so you don't necessarily need
translate3d. But these are the four to stick with if you want to play it safe.
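To make that concrete, here's a minimal sketch of a hardware-accelerated transition driven from JavaScript (the '.card' selector is just a placeholder for one of your own elements):

var card = document.querySelector('.card');

// Declare the transition on compositable properties only.
card.style.transition = 'transform 300ms ease-out, opacity 300ms ease-out';

requestAnimationFrame(function () {
  // Changing transform/opacity lets the browser animate the element's layer on
  // the GPU; animating top/left/width/height would force layout and repaints.
  card.style.transform = 'translate3d(200px, 0, 0) scale3d(1.1, 1.1, 1)';
  card.style.opacity = '0.5';
});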
top / left
top / left / translateZ(0)
translate3d
Run JS for 1 second
Now if you don't believe me, and you go running off
using CSS animations without using transforms/opacity,
you'll get something faster than JavaScript animations,
but still not as fast as hardware-accelerated animations.
(And thus not as fast as native.) Also, the non-GPU animations
are blocked by JavaScript and the DOM!
In the links at the end of these slides,
I have a Jake Archibald video where he goes over all these different
techniques and shows exactly what the performance looks like on a Nexus 7.
It's a great example of how you can fool yourself by only testing on
your desktop machine, because notice how those snakes look fine on
desktop but horrible on the tablet.
I also tinkered with his demo and attached it to these slides.
The Android problem
Now, I'm going to move on to some Android-specific advice.
Android is the new IE
[Chart: % of users and HTML5 Test score by Android version - the data is in the table below.]
If you want to know what Android web development is like, imagine a parallel
universe where instead of IE being the non-evergreen browser everyone hated,
it was Chrome. And every user had a different version of Chrome, so you had
to worry about bugs in Chrome 37 that didn't appear in Chrome 36, or you had to
worry about bugs introduced by Samsung and HTC that only appeared on their phones.
That's Android.
These are the HTML5 test scores for the WebView in each of the recent
major Android versions. If you test your desktop browser (Firefox or Chrome), it'll
score around 500. Only Android 5.0 Lollipop actually scores that well, because
it's actually, finally, an evergreen WebView. That's the one you want, and it's less
than 0.1% of all users.
The table is included here for accessibility.
Android    2.2      2.3    4.0    4.2     4.3    4.4     5.0 (estimated)
Score      187      187    272    278     384    428     497
Coverage   0.035%   7.8%   6.7%   39.5%   6.5%   39.1%   0.05%
Luckily, we finally have a solution to this problem. Intel has put out
a free, open-source library called Crosswalk that allows you to basically
bundle the latest version of Chromium with your app. Then you can use
that instead of the WebView. It gives you a consistent target to code to,
and it works back to Android 4.0.
Oh, and did I mention performance?
Android 4.4.4 WebView (left, Chromium v33) vs. Crosswalk (right, Chromium v38)
This is the same app running on Android 4.4.4 WebView vs the Crosswalk XWalkView.
It's not an old version of Android at all - it's the one right
before Lollipop - and yet the difference in performance is huge. I've seen old
4.0 phones that outperform 4.4 just because they're running Crosswalk.
Crosswalk adds about 20MB to the download size of your APK, because you're
basically giving your users Chrome, but for what you get,
it's definitely worth it. You get the Lollipop experience all the way
back to Android 4.0, which covers nearly all Android users.
This is probably the single best thing you can do to improve the performance of
your hybrid app. I can't recommend it enough.
Pick your frameworks carefully
This final point is about frameworks. Now I know we all love our MVC and
CSS frameworks...
But if you're doing mobile development, you may find the performance a little
underwhelming. This is not a jab at any framework in particular - we all care
about performance, but I'm just saying that mobile isn't necessarily their
number-1 concern, so we have to be cautious and try stuff out in advance.
Mobile-optimized frameworks
And if you're really worried about performance, you can try one of the many
mobile-optimized frameworks. There are a bunch of these out there, and they all
have different tricks to get better performance on mobile.
Angular ng-repeat (left) vs. Ionic collection-repeat (right)
For example, here's something Ionic has done. On the right, you have Ionic's custom
collection-repeat directive, which is a replacement for the standard
Angular ng-repeat directive, i.e. the thing that shows a list of items.
And you'll notice that the Ionic one is way faster than the Angular one, and
that I can actually flick through the list of items just as fast as I could on
a native list.
How they did this is actually really interesting. They just used the same trick
that the native UICollectionView (iOS) or ListView (Android) does, which is that
they only render those DOM elements that are currently visible. So now we finally
have smooth, fast-scrolling lists on the mobile web!
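If you're curious what that trick looks like under the hood, here's a stripped-down sketch of the general idea - not Ionic's actual code - assuming fixed-height rows and a scrollable #scroller element containing a tall #spacer element:

var ROW_HEIGHT = 50;                                 // assume fixed-height rows
var scroller = document.getElementById('scroller');  // the scrollable container
var spacer = document.getElementById('spacer');      // tall element inside it
var items = [];                                      // your data array

spacer.style.position = 'relative';
spacer.style.height = (items.length * ROW_HEIGHT) + 'px';

var pool = []; // small pool of reusable row elements

function render() {
  var first = Math.floor(scroller.scrollTop / ROW_HEIGHT);
  var visibleCount = Math.ceil(scroller.clientHeight / ROW_HEIGHT) + 1;

  // Only touch as many DOM nodes as fit on screen, no matter how long the list is.
  for (var i = 0; i < visibleCount; i++) {
    var row = pool[i] || (pool[i] = spacer.appendChild(document.createElement('div')));
    row.style.position = 'absolute';
    row.style.height = ROW_HEIGHT + 'px';
    var index = first + i;
    // Move the recycled node into place with a GPU-friendly transform.
    row.style.transform = 'translate3d(0,' + (index * ROW_HEIGHT) + 'px,0)';
    row.textContent = index < items.length ? items[index] : '';
  }
}

scroller.addEventListener('scroll', render);
render();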
Last but not least, I want to talk about offline-first.
The basic idea of offline-first is that you want to cache aggressively,
and prefer talking to the local data store rather than the network.
And the reason you want to do that is that the network is
slow and unreliable. And every time your app
needs to make a network call before it can proceed, your users
have to stare at a spinner. This can slow down your app a ton, and furthermore
it means your app can stop working when your users go on the subway or
a far corner of the building with poor reception.
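In code, offline-first boils down to: read and write locally, sync in the background. Here's a rough sketch using PouchDB (the database name and remote URL are made up for the example):

var localDB = new PouchDB('todos');
var remoteDB = new PouchDB('https://example.com/db/todos');

// Reads always hit the local database, so the UI never waits on the network.
function getTodos() {
  return localDB.allDocs({include_docs: true}).then(function (result) {
    return result.rows.map(function (row) { return row.doc; });
  });
}

// Writes go local-first too...
function addTodo(todo) {
  return localDB.post(todo);
}

// ...and the network is only used for live, two-way background sync.
localDB.sync(remoteDB, {live: true, retry: true})
  .on('change', renderTodos) // renderTodos is your own UI refresh function
  .on('error', function (err) { console.error(err); }); // never blocks the UI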
You can make lovely spinners with beautiful CSS animations, but people
aren't using your app to look at a spinner. Everything
I said before about smooth animations and crisp UIs doesn't matter if
your users are staring at a spinner.
This technique is important for desktop, but it's especially important
for mobile, because of just how slow cell networks are.
Ilya Grigorik - Breaking the 1000ms Time to Glass Mobile Barrier
This video by Ilya Grigorik is a must-see. The basic takeaway is that
mobile networks are slow, especially when it comes to latency, and 4G is
not going to save us.
Furthermore, these numbers he points out (100-400 milliseconds) are the best case.
If the radio needs to warm up, then this is a lot greater. And if the user
dips offline, then the latency shoots to infinity.
On the other hand, once a connection is open, you can actually download data
pretty fast. So much of the solution here is the same stuff we've done for
years - concatenating, minifying, spriting, and reducing the number of
network requests. But with offline-first, you can take it one step further.
Offline-first libraries
- PouchDB
- LocalForage
- YDN-DB
- Hoodie
- Lawnchair
- RemoteStorage
- MakeDrive
- IndexedDBShim
There are lots of libraries out there for doing offline-first, but I'm
just going to quickly show one way we use PouchDB at Squarespace in
our hybrid Android app for faster image loading.
pouchdb-lru-cache
+
blob-util
=
easy offline images
We have an open-source PouchDB plugin called pouchdb-lru-cache, which, when
combined with another library I wrote called blob-util, gives you an easy
way to cache images offline.
And this is what it looks like. On the left, you have images
loading normally, using a throttled wifi endpoint we use
to simulate a slow 2G network, and on the right is the second
time around, when we load the images from PouchDB. Blink
and you might miss it!
Loading images on a slow connection (left) vs. Cached images in PouchDB (right)
var db = new PouchDB('my-images');
db.initLru(5000000); // store 5 MB max

function getImageSrc(src) {
  return db.lru.get(src).then(function (blob) {
    return blobUtil.createObjectURL(blob); // cached
  }).catch(function () { // not cached
    return blobUtil.imgSrcToBlob(src).then(function (blob) {
      return db.lru.put(src, blob); // cache it
    }).then(function () {
      return src;
    });
  });
}
Basically you can think of this as a poor man's Service Worker. We are
overriding the browser's built-in cache, which normally would be cleared
when a user closes our app. Instead, we are setting a max limit of
how much data we want to store, then we are downloading images and putting
them in the cache if they aren't already there. Then if they're cached, we
can just show an image instantly instead of waiting for it to load all over
again. It's an LRU, so the least-recently used images get evicted from
the cache.
We actually go one step further and maintain two caches - a hi-res one
and a low-res one. The low-res one is essentially infinite, so that means
that, even if a user is offline and their hi-res image got evicted, we
can still show them a low-res image. Basically, if they've downloaded
an image once, they'll never have to download it again. This gives them
a faster experience whether they're online or offline.
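Here's roughly what that two-cache setup looks like with pouchdb-lru-cache - a simplified sketch with made-up names and sizes, not our exact production code:

var hiResDB = new PouchDB('images-hires');
var loResDB = new PouchDB('images-lores');
hiResDB.initLru(5000000); // 5 MB cap: hi-res images get evicted under pressure
loResDB.initLru();        // no max size (assumed), so tiny low-res thumbnails stick around

function getBestImageSrc(hiResSrc, loResSrc) {
  return hiResDB.lru.get(hiResSrc).then(function (blob) {
    return blobUtil.createObjectURL(blob); // best case: hi-res is cached
  }).catch(function () {
    // Fall back to the low-res thumbnail, which should always be there
    // once it has been downloaded at least once.
    return loResDB.lru.get(loResSrc).then(function (blob) {
      return blobUtil.createObjectURL(blob);
    });
  });
}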
Summary
- Harness the GPU for animations
- Use Crosswalk for Android
- Choose your frameworks wisely
- Go offline-first