There are a lot of bad ads on the web today: ads that are slow and janky, and that deliver poor user experiences by popping up over content or causing it to reflow on your screen. The net result: users install ad blockers, making it difficult for publishers to fund the content they’re creating, and hurting advertisers even when they use acceptable ad practices.
The AMP Project announced the open source AMP for Ads initiative in July to address this issue and ensure users have great experiences on the web with both content and ads. The initiative’s goal is to fix the foundation of digital advertising on the web, applying the principles of AMP to advertising and making ads faster, more beautiful and secure.
Since the announcement, AMP Ads has seen a lot of momentum. Publishers across the world like The Washington Post, The Guardian, and USA Today have been testing AMP Ads via DoubleClick for Publishers.
Today, we’re sharing news that underscores the progress on the initiative.
TripleLift brings speed to native ads with AMP Ads
Native advertising platform TripleLift now serves AMP Ads for publishers and advertisers using their services. Time Inc. is their first publisher partner testing AMP Ads.
Time Inc.’s VP of Digital Revenue Strategy and Operations, Kavata Mbondo, says, “AMP ads represent an opportunity to fix key issues with regards to ad experiences on the web. In our tests with TripleLift, AMP ads are already more viewable, up 13% from standard ads on AMP pages. We’ve also seen corresponding improvements in CTRs and eCPMs.”
TripleLift also found that AMP ads load 6x faster and are 3x lighter than comparable standard ads.
TripleLift’s President, Shaun Zacharia, says, “As a native advertising platform, we want to make ads seamless with the pages in which they’re served. Now with AMP for Ads, we get to realize that vision while ensuring we’re delivering the better ad performance our partners are looking for.”
Cloudflare now provides ad verification services for AMP Ads
Unlike standard ads, AMP ads must be verified by an authorized signing service before being served to a page. This is important to ensure both a fast and secure user experience. Signed ads allow AMP to trust that the ads are properly created in AMP format. This allows AMP to coordinate the ad experience alongside other components on the page, for a fast and reliable ad experience.
Cloudflare, a leading performance and security company, announced today an AMP ad verification and optimization service, Firebolt, that improves the ad experience for any third-party ad network or publisher. Utilizing Cloudflare’s AMP cache, Firebolt helps to deliver faster, safer, and more efficient ads.
This collaboration with the broader ad tech community is a true testament to the open source nature of the AMP Project.
Not your typical banner ad
One of the key tenets of AMP Ads is that improving the user experience shouldn’t throttle the creativity of advertisers and publishers. As you can see from the demo of USA Today’s early tests, you can deliver AMP ads using really creative ad formats. Importantly, more creative functionality is being added by the open source developer community with every passing week.
And that’s not all. We’re also working with agencies and buying platforms to ensure they have the tools they need to both build and deliver these fast AMP ad creatives. Advertising teams at Google are also looking into ways in which they can auto convert ads to AMP creatives so that advertisers can get better marketing performance from faster, better ad experiences.
This is the beginning of an exciting journey but there’s a lot more work ahead and we need to work together to ensure that ads remain the lifeblood of the internet. So get involved, join the AMP for Ads initiative, share your ideas and participate in a new fast, beautiful and open ads ecosystem.
Shz.de is one of the top news publishers in Schleswig-Holstein, Germany’s northernmost state. Each month around 1.5 million unique users access our content to stay up to date on local news. In 2011 we saw an increasing number of our readers viewing content on their mobile phones, so we produced a mobile app, which has been downloaded over 40,000 times.
After four years of minor updates, we decided to completely rebuild our app because it was difficult and resource-intensive to keep up with app store update cycles and RSS limitations. Our plan was to use mobile web content inside the app, which gave us flexibility and cost-effective development, but we still wanted to ensure content loaded quickly inside the app. Our team learned about the Accelerated Mobile Pages project through documentation on AMPproject.org and by inspecting examples on ampbyexample.com. We were excited by the possibility of having content flexibility across platforms with near instantaneous page load times. “Our belief is that anything less than instant kills engagement. Why not integrate AMP article pages in our native apps?”, explains CEO Nicolas L. Fromm.
We piloted utilizing AMP pages inside our smallest mobile app and were extremely pleased with the results. With only a few hours of additional development work, we were seeing articles load 4x faster than in previous app versions. Additionally, pageviews per session increased by 25% – achieving our goal of both speed and engagement. With these initial easy wins, we quickly rolled out the AMP format to our other apps.
Using AMP pages inside our app has provided fantastic improvements with speed and user engagement, but we’re not stopping there. We’re now exploring improving caching and offline support in our app. We look forward to these future enhancements and are excited to share our success with AMP. As Mr. Fromm said, “We’re in the business of creating engagement, building loyalty and monetizing quality journalism. Our main goal in building digital products is to create a frictionless and uninterrupted reading experience in our apps and news sites.”
Posted by Mario Lorenzen, Portal Manager, mh:n digital
I can’t believe it’s already been over a year since we started our quest for faster, friendlier web pages. Now that we’re out of the honeymoon phase, the AMP team is taking a hard look at where we are today, what’s to come, and the direction of the AMP ecosystem.
It’s easy to blaze ahead with the development of new features, but it’s infinitely harder to create a healthy, breathing ecosystem. To do so, we want to continue to involve all of you – the AMP community – in figuring out the right path together. How better to kickstart things than to meet up?
With that, it’s my great pleasure to invite you to our first-ever AMP Conf. First, the basics:
March 7th and 8th
At the Tribeca 360 space in New York
Live-streamed around the world
Two full days of talks and panels
Targeted at web developers & designers
Whether you’re interested in or already building AMP pages, building a platform to display AMP content, or want to contribute to AMP itself (yes please!), we want you to participate. Request a seat to attend in person, or join via the live stream on YouTube.
Not only will the AMP team talk about new, exciting features and components – more than half of all talks and panels will be from you, members of the AMP ecosystem. We’ll discuss:
The challenges and wins of running AMP in production
How to create better, beautiful and interactive AMP pages
How your AMP pages are distributed across platforms
How to monetize AMP pages and the innovation happening around ads in AMP
How you can contribute to AMP
We’ll follow up with a more detailed conference schedule by the end of January, and if you have any questions not covered in the FAQ, reach out to me (or email@example.com) anytime.
See you soon!
Posted by Paul Bakaus, AMP Developer Advocate, Google
The following was posted on Medium by Paul Bakaus, AMP Developer Advocate, Google.
Caches are a fundamental piece of the Accelerated Mobile Pages (AMP) Project, yet one of the most misunderstood components. Every day, developers ask us why they can’t just get their AMP pages onto some AMP surfaces (e.g. Google) without linking through the cache. Some worry about the cache model breaking the origin model of the web, others worry about analytics attribution and canonical link sharing, and even others worry about their pages being hosted on servers out of their control. Let’s look at all of these, and understand why the caches exist.
While AMP Caches introduce some trade-offs, they do work in the user’s favor to ensure a consistently fast and user-friendly experience. The caches are designed to:
Ensure that all AMP pages are actually valid AMP.
Allow AMP pages to be preloaded efficiently and safely.
Do a myriad of additional user-beneficial performance optimizations to content.
The Basics: Analytics attribution and link sharing
Even though the AMP Cache model doesn’t follow the origin model (serving your page from your own domain), we attribute all traffic to you, the publisher. Through the <amp-analytics> tag, AMP supports a growing number of analytics providers (26 to date and growing!), to make sure you can measure your success and the traffic is correctly attributed.
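To make this concrete, here’s a rough sketch of what that wiring looks like on a page (the vendor type is one of the supported providers, and the account ID is a placeholder):

```html
<!-- Requires the amp-analytics extension script in <head>:
     https://cdn.ampproject.org/v0/amp-analytics-0.1.js -->
<amp-analytics type="googleanalytics">
  <script type="application/json">
  {
    "vars": {
      "account": "UA-XXXXX-Y"
    },
    "triggers": {
      "trackPageview": {
        "on": "visible",
        "request": "pageview"
      }
    }
  }
  </script>
</amp-analytics>
```

Swap the `type` and the JSON config for your provider of choice; each supported vendor documents its own variables and requests.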
When I ask users and developers why they want to “click through” to the canonical page from a cached AMP result, the answer is often about link sharing. And granted, it’s annoying to copy a google.com URL instead of the canonical URL. However, the issue isn’t as large as you’d think: Google amends its cached AMP search results with Schema.org and OpenGraph metadata, so posting the link to any platform that honors these should result in the canonical URL being shared. That being said, there are more opportunities to improve the sharing flow. In native web views, one could share the canonical URL directly if the app supports it, and, based on user feedback, the team at Google is working on enabling easy access to the canonical URL on all its surfaces.
With these cleared up, let’s dig a little deeper.
When the label says AMP, you get AMP
The AMP Project is an ecosystem that depends on strict validation, ensuring that very high performance and quality bars are met. A validator can be used during development, but the AMP Cache ensures validity at the last stage, when presenting content to the user.
When an AMP page is requested from an AMP Cache for the first time, the cache validates the document first, and won’t offer it to the user if it detects problems. Platforms integrating with AMP (e.g. Bing, Pinterest, Google) can choose to send traffic directly to the AMP page on the origin or optionally to an AMP Cache, but validity can only be guaranteed when served from the cache. It ensures that when users see the AMP label, they’ll almost always get a fast and user-friendly experience. (Unless you find a way to make a slow-but-valid AMP page, which is hard, but not impossible… I’m looking at you, big web fonts).
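For reference, what the validator checks for includes a small set of mandatory markup (among many other rules). A minimal valid AMP document looks roughly like this, with the canonical URL as a placeholder and the mandatory boilerplate CSS abbreviated:

```html
<!doctype html>
<html ⚡>
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <link rel="canonical" href="https://example.com/article.html">
  <style amp-boilerplate>/* mandatory AMP boilerplate CSS, verbatim */</style>
  <noscript><style amp-boilerplate>/* mandatory noscript variant, verbatim */</style></noscript>
  <script async src="https://cdn.ampproject.org/v0.js"></script>
</head>
<body>
  <h1>Hello, AMP</h1>
</body>
</html>
```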
Pre-rendering is a bigger deal than you think
If you take anything away from this post, it’s that pre-rendering, especially the variant in AMP, greatly outweighs any speed gains you could theoretically get by hosting directly from an origin server. Even in the rare case where the origin server is closer to your users and shaves off a few milliseconds, pre-rendering will almost certainly drive the most impact.
Perceived as much faster
In fact, pre-rendering can often save you seconds, not milliseconds. The impact of pre-rendering, as opposed to the various other performance optimizations in the AMP JS library, can be pretty dramatic, and contributes largely to the “instant-feel” experience.
Very efficient compared to full pre-rendering
If that were the whole story, we could just as easily pre-render AMP pages from their origin servers. But then we couldn’t guarantee that a page is valid AMP, and valid AMP is critically important for the custom pre-rendering the AMP JS library provides: pre-rendering in AMP, as opposed to pre-rendering an entire page through something like link prefetching, also limits the use of the user’s bandwidth, CPU and battery!
Valid AMP documents behave “cooperatively” during the pre-render stage: Only assets in the first viewport get preloaded, and no third-party scripts get executed. This results in a much cheaper, less bandwidth and CPU-intensive preload, allowing platforms to prerender not just the first, but a few of the AMP pages a user will likely click on.
Safe to embed
Because AMP Caches can’t rely on browser pre-rendering (see the section above), normal navigation from page to page doesn’t work. So in the AMP caching model, a page needs to be opened inline on a platform page. AMP Caches ensure that the requested AMP page can do that safely:
On top of the validator, the AMP Cache parses and then re-serializes the document in an unambiguous fashion (this means that it does not rely on HTML5 error correction). This ensures that browser parsing bugs and differences cannot lead to XSS.
In addition, the AMP Caches remove one important potential privacy issue from the pre-render: When you do a search on a content platform preloading AMP pages on the result page, none of the preloaded AMP pages will ever know about the fact that they’ve been preloaded.
Think about it this way: Say I search for “breakfast burrito”. If you know me well, you know I obviously searched for Parry Gripp’s song with the same name. But the search result page also shows me a couple of AMP search results from fast food chains that sell actual breakfast burritos. For the next month, I wouldn’t want to see actual breakfast burritos everywhere even though I didn’t click on these links (even though…maybe I do…mhh..), and an advertiser wouldn’t want to waste dollars on me for pointless re-marketing ads on all the burritos. Since AMP hides the preload from the publisher of the AMP page and related third parties, it’s a win-win scenario for users and advertisers.
Auto-optimizations that often result in dramatic speed increase
The AMP Cache started out with all of the above, but has since added a number of transformative transformations (heh) to its feature roster. Among those optimizations:
Provides a consistent, fast and free content delivery network for all content (not just big publishers).
Optimizes HTML through measures such as bringing scripts into the ideal order, removing duplicate script tags and removing unnecessary quotes and whitespace.
On the image compression side alone, Google, through its cache, is doing lossless (without any visual change, e.g. removing EXIF data) and lossy (without noticeable visual change) compression. In addition, it converts images to WebP for browsers that support it and automatically generates srcset attributes (so-called responsive images) if they’re not already available, generating and serving correctly sized images to each device.
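To make the markup rewrites a little more concrete, here’s an illustrative before and after of the quote and whitespace stripping (not the cache’s literal output):

```html
<!-- Before: as authored -->
<amp-img src="photo.jpg" width="400" height="300" layout="responsive"></amp-img>

<!-- After: unnecessary quotes removed, whitespace collapsed -->
<amp-img src=photo.jpg width=400 height=300 layout=responsive></amp-img>
```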
Isn’t there a better way of doing this?
Look, I hear you. The provider of an AMP Cache is mirroring your content. It’s an important role and comes with great responsibility. If the cache provider were to do something truly stupid, like inserting obnoxious ads into every AMP page, AMP would stop being a viable solution for publishers, and thus wither away.
Remember, AMP has been created together with publishers, as a means to make the mobile web better for publishers, users and platforms. It’s why the AMP team has released strict guidelines for AMP Caches. To give you two interesting excerpts, the guidelines state that your content needs to provide “a faithful visual and UX reproduction of source document”, and cache providers must pledge that they will keep URLs working indefinitely, even after the cache itself may be decommissioned. These, and many more rules, ensure that a cache doesn’t mess with your content.
Most importantly, there’s plenty of room for more than one AMP Cache – in fact, Cloudflare just announced their own! With these AMP Cache guidelines released, other infrastructure companies are welcome to create new AMP Caches, as long as they follow the rules. It’s then up to the platform integrating AMP to pick their favorite cache.
From cache to web standards?
You just read about all the wins and trade-offs the AMP Caches make to provide an instant-feeling, user-friendly mobile web experience. What if we could get many of the same awesome optimizations without the trade-offs, and without involving a cache at all?
Personally, I dream of future, still-to-be-invented web standards that would allow us to move beyond cache models (like a static layout system to know what a page will look like before any assets are loaded).
In 2016, we took our first baby steps with the Content Performance Policy (CPP), which turned into the Feature Policy: a way of saying things like “I disallow document.write on my site, and for any third parties in any iframes that get loaded”. More advanced concepts like static layout and safe pre-rendering require far-reaching changes to the web platform, but hey – just like forward time travel, it’s not impossible, just very, very difficult 🙂
Join me in figuring this out by getting in touch on Twitter or Slack, and know that I’ll always have an open ear for your questions, ideas and concerns. Onwards!
Posted by Paul Bakaus, AMP Developer Advocate, Google
We predict that in 2017 more than half of the traffic to Cloudflare’s network will come from mobile devices. Even when pages are formatted for small screens, the mobile web is built on traditional web protocols and technologies that were designed for desktop CPUs, network connections, and displays. As a result, browsing the mobile web feels sluggish compared with using native mobile apps.
In October 2015, the team at Google announced Accelerated Mobile Pages (AMP), a new, open technology to make the mobile web as fast as native apps. Since then, a large number of publishers have adopted AMP. Today, 600 million pages across 700,000 different domains are available in the AMP format.
The majority of traffic to this AMP content comes from people running searches on Google.com. If a visitor finds content through some source other than a Google search, even if an AMP version of the content exists, it typically won’t be served. As a result, the mobile web continues to be slower than it needs to be.
Making the Mobile Web App-Quick
Cloudflare’s Accelerated Mobile Links helps solve this problem, making content, regardless of how it’s discovered, app-quick. Once enabled, Accelerated Mobile Links automatically identifies links on a Cloudflare customer’s site to content with an AMP version available. If a link is clicked from a mobile device, the AMP content will be loaded nearly instantly.
To see how it works, try viewing this post from your mobile device and clicking the links within it.
One of the benefits of Accelerated Mobile Links is that AMP content is loaded in a viewer directly on the site that linked to the content. As a result, when a reader is done consuming the AMP content, closing the viewer returns them to the original source of the link. In that way, every Cloudflare customer’s site can be more like a native mobile app, with the corresponding increase in user engagement.
For large publishers that want an even more branded experience, Cloudflare will offer the ability to customize the domain of the viewer to match the publisher’s domain. This, for the first time, provides a seamless experience where AMP content can be consumed without having to send visitors to a Google-owned domain. If you’re a large publisher interested in customizing the Accelerated Mobile Links viewer, you can contact Cloudflare’s team.
Innovating on AMP
While Google was the initial champion of AMP, the technologies involved are open. We worked closely with the Google team in developing Cloudflare’s Accelerated Mobile Links as well as our own AMP cache. Malte Ubl, the technical lead for the AMP Project at Google said of our collaboration:
“Working with Cloudflare on its AMP caching solution was as seamless as open-source development can be. Cloudflare has become a regular contributor on the project and made the code base better for all users of AMP. It is always a big step for a software project to go from supporting specific caches to many, and it is awesome to see Cloudflare’s elegant solution for this.”
Cloudflare now powers the only compliant non-Google AMP cache with all the same performance and security benefits as Google.
In the spirit of open source, we’re working to help develop updates to the project to address some of publishers’ and end users’ concerns. Specifically, here are some features we’re developing to address concerns that have been expressed about AMP:
Easier ways to share AMP content using publishers’ original domains
Automatically redirecting desktop visitors from the AMP version back to the original version of the content
A way for end users who would prefer not to be redirected to the AMP version of content to opt out
The ability for publishers to brand the AMP viewer and serve it from their own domain
Cloudflare is committed to the AMP project. Accelerated Mobile Links is the first AMP feature we’re releasing, but we’ll be doing more over the months to come. As of today, Accelerated Mobile Links is available to all Cloudflare customers for free. You can enable it in your Cloudflare Performance dashboard. Stay tuned for more AMP features that will continue to increase the speed of the mobile web.
Editor’s Note: This blog post was amended to remove the term “AMP Lite”. This was a code name for a project to make AMP better for slow networks but many readers interpreted this as a separate version of AMP. We apologize for the confusion.
At Google we believe in designing products with speed as a core principle. The Accelerated Mobile Pages (AMP) format helps ensure that content reliably loads fast, but we can do even better.
Smart caching is one of the key ingredients in the near-instant AMP experiences users get in products like Google Search and Google News & Weather. With caching, we can generally bring content physically closer to the users requesting it, so that bytes take a shorter trip over the wire. In addition, a single common infrastructure like a cache provides greater consistency in page serving times, even though the content originates from many hosts, which might have very different, and much larger, latency in serving the content as compared to the cache.
Faster and more consistent delivery are the major reasons why pages served in Google Search’s AMP experience come from the Google AMP Cache. The Cache’s unified content serving infrastructure opens up the exciting possibility to build optimizations that scale to improve the experience across hundreds of millions of documents served. Making it so that any document would be able to take advantage of these benefits is one of the main reasons the Google AMP Cache is available for free to anyone to use.
In this post, we’ll highlight two improvements we’ve recently introduced: (1) optimized image delivery and (2) enabling content to be served more successfully in bandwidth-constrained conditions.
Image optimizations by the Google AMP Cache
On average across the web, images make up 64% of the bytes of a page. This means images are a very promising target for impactful optimizations.
Applying image optimizations is an effective way to cut bytes on the wire. The Google AMP Cache employs the image optimization stack used by the PageSpeed Modules and Chrome Data Compression. (Note that in order to make the following transformations, the Google AMP Cache disregards the “Cache-Control: no-transform” header.) Sites can get the same image optimizations on their origin by installing PageSpeed on their server.
Here’s a rundown of some of the optimizations we’ve made:
1) Removing data which is invisible or difficult to see
We remove image data that is invisible to users, such as thumbnail and geolocation metadata. For JPEG images, we also reduce quality and color samples if they are higher than necessary. To be exact, we reduce JPEG quality to 85 and color samples to 4:2:0 — i.e., one color sample per four pixels. Compressing a JPEG to quality higher than this or with more color samples takes more bytes, but the visual difference is difficult to notice.
The reduced image data is then exhaustively compressed. We’ve found that these optimizations reduce bytes by 40%+ while not being noticeable to the user’s eye.
2) Converting images to WebP format
Some image formats are more mobile-friendly. We convert JPEG to WebP for supported browsers. This transformation leads to an additional 25%+ reduction in bytes with no loss in quality.
3) Adding srcset
We add “srcset” if it has not been included. This applies to “amp-img” tags with a “src” but no “srcset” attribute. The operation includes expanding the “amp-img” tag as well as resizing the image to multiple dimensions. This reduces the byte count further on devices with small screens.
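As an illustration, an “amp-img” authored with only a “src” might be rewritten along these lines (the resized-variant URL paths here are hypothetical):

```html
<!-- As authored: a single source image -->
<amp-img src="hero.jpg" width="1080" height="610" layout="responsive"></amp-img>

<!-- As served from the cache: resized variants added via srcset -->
<amp-img src="hero.jpg"
  srcset="/w1080/hero.jpg 1080w,
          /w820/hero.jpg 820w,
          /w620/hero.jpg 620w"
  width="1080" height="610" layout="responsive"></amp-img>
```

The browser then picks the smallest variant that still looks sharp for the current viewport and screen density.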
4) Using lower quality images under some circumstances
We decrease the quality of JPEG images when there is an indication that this is desired by the user or for very slow network conditions (as discussed below). For example, we reduce JPEG image quality to 50 for Chrome users who have turned on Data Saver. This transformation leads to another 40%+ byte reduction to JPEG images.
The following example shows the images before (left/top) and after (right/bottom) optimizations. Originally the image has 241,260 bytes, and after applying Optimizations 1, 2, & 4 it becomes 25,760 bytes. After the optimizations the image looks essentially the same, but 89% of the bytes have been saved.
AMP for Slow Network Conditions
Many people around the world access the internet with slow connection speeds or on devices with low RAM, and we’ve found that some AMP pages are not optimized for these severely bandwidth-constrained users. For this reason, Google has also started an effort to remove even more bytes from AMP pages for these users.
With this initiative, we apply all of the above optimizations to images. In particular, we always use lower quality levels (see Bullet 4 above).
In addition, we optimize external fonts by using the amp-font tag, setting the font loading timeout to 0 seconds so pages can be displayed immediately regardless of whether the external font was previously cached or not.
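In AMP markup, that looks roughly like the following (the font name and CSS class names are placeholders):

```html
<!-- Requires the amp-font extension script in <head>.
     timeout="0" means: use the font only if it’s already available,
     otherwise fall back immediately instead of delaying display. -->
<amp-font layout="nodisplay"
  font-family="Custom Font"
  timeout="0"
  on-error-remove-class="custom-font-loading"
  on-load-add-class="custom-font-loaded">
</amp-font>
```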
We will be rolling out these optimizations for bandwidth-constrained users in several countries, such as Vietnam and Malaysia, and for users of low-RAM devices globally. Note that these optimizations may modify the fine details of some images, but they do not affect other parts of the page, including ads.
* * *
All told, we see a combined 45% reduction in bytes across all optimizations listed above.
We hope to go even further in making more efficient use of users’ data to provide even faster AMP experiences.
With the arrival of 2017, it’s time to review some launches from Q4 and projects started over the past few months that will continue into the new year. We’ve updated the AMP Roadmap to provide a more detailed reflection of what’s summarized below.