foodpanda is a Berlin-based startup, and we run several online food ordering platforms worldwide, including brands like foodpanda, hellofood, Donesi, Pauza, NetPincer, Otlob, HungerStation and 24h.ae.
Headquartered in Berlin, we have offices in 22 countries, including Singapore, Hong Kong, Malaysia, India, Thailand, Pakistan, Hungary, Croatia, Serbia, Saudi Arabia, Egypt and Taiwan. By enabling restaurants to become more visible online and on mobile, we increase their demand. foodpanda offers users the most convenient food ordering system and the widest culinary range to order from online or via app.
On the hunt for a tool to help us monitor the SEO health of all foodpanda’s platforms, we found DeepCrawl. Even though foodpanda has quite a big SEO team, monitoring 24 platforms is challenging at times, especially since the IT team implements changes in the frontend at least twice a week.
We needed a tool that could help identify critical issues…and fast! Somehow, it wasn’t until the migration of the Singaporean platform foodpanda.sg that we fully realized the potential of DeepCrawl.
On its way to becoming the leading food delivery platform in Singapore, foodpanda decided to migrate the platform to a newer structure in January 2016. We all know that a website migration is potentially one of the most dangerous and tricky SEO operations, a process that fundamentally requires the utmost care to avoid disasters like significant traffic loss or lower rankings.
So as not to lose traffic – in this case users and their orders – we had to carefully plan the migration of the old platform to the new one. To make the transition as smooth as possible for users and search engines alike, we decided to migrate the platform only partially at first, showing the new platform only to select users: specifically, active users on the website who didn’t have the mobile app. foodpanda started with the first 5%, then rolled it out to the next 20%, and finally released the new version of the whole platform.
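A staged rollout like the one described above is commonly implemented by deterministically bucketing each user so the same user always sees the same version. The sketch below is purely illustrative (foodpanda’s actual implementation is not described in the case study; all function names and the eligibility flags are assumptions): hash the user ID into a bucket from 0–99 and compare it against the current rollout percentage.

```python
import hashlib

def rollout_bucket(user_id: str) -> int:
    """Map a user ID deterministically to a bucket in 0-99."""
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % 100

def sees_new_platform(user_id: str, rollout_percent: int,
                      is_active_web_user: bool, has_mobile_app: bool) -> bool:
    """Only active website users without the mobile app are eligible,
    matching the segment described in the text (hypothetical logic)."""
    if not is_active_web_user or has_mobile_app:
        return False
    return rollout_bucket(user_id) < rollout_percent
```

Because the bucket is derived from a hash rather than a random draw, raising the rollout from 5% to 25% keeps every user from the first 5% in the new experience while adding the next 20%.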
One of the main challenges was switching selected users while they were logged in on the website. Most crawling tools do not support cookies, which is why we needed a solution we could test with real users without showing the new platform to Googlebot or other users. Not only did we want to test the URLs, but also the effectiveness of the whole process: restaurant integration, ordering, payment and confirmation. The only solution was to make the new front-end visible to DeepCrawl but not to Googlebot. But how?
We used DeepCrawl’s IP options (in the Advanced Settings) and set the crawler to a static IP address in order to support cookies. Using the report overview, DeepCrawl showed data from before and after the tests, playing a key role in comparing the old and new sites, so we could easily evaluate the new site before going live.
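The server-side gate implied by this setup can be sketched as follows. This is a minimal illustration under stated assumptions, not foodpanda’s actual code: the IP address, cookie name, and function are all hypothetical. The idea is that the allow-listed crawler IP and cookie-flagged test users see the new frontend, while Googlebot never does.

```python
# Hypothetical static IP configured in DeepCrawl's Advanced Settings.
ALLOWED_CRAWLER_IPS = {"203.0.113.10"}

def serve_new_frontend(client_ip: str, cookies: dict, user_agent: str) -> bool:
    # Never expose the unreleased frontend to Googlebot.
    if "Googlebot" in user_agent:
        return False
    # The allow-listed crawler IP always sees the new frontend.
    if client_ip in ALLOWED_CRAWLER_IPS:
        return True
    # Real test users are flagged via a session cookie (hypothetical name).
    return cookies.get("new_platform") == "1"
```

Checking the user agent first means that even if Googlebot were ever to crawl from an unexpected address, it would still be served the old, indexed version.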
Once the migration process started, DeepCrawl found our on-site errors and SEO issues at first glance. One of them was pagination: DeepCrawl’s pagination report showed that crawl budget was being spent on content that was unreachable and useless for users. Every site should aim to show users only relevant content for the best possible user experience and, similarly, to avoid wasting crawl budget, so that search engines crawl relevant, high-value pages.
Busy SEOs often make the mistake of focusing solely on pages that bring conversions, sometimes overlooking service pages. DeepCrawl helped us notice errors in the service pages, where relevant content was not properly integrated during the migration (duplicate titles, meta descriptions, HTML errors and more).
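The duplicate-title check mentioned above can be approximated in a few lines, grouping crawled pages by normalized title. This is an illustrative sketch only (the sample URLs and titles are invented; DeepCrawl’s own reports cover far more than this):

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group URLs by page title; any title shared by 2+ URLs is a duplicate.

    `pages` is a list of (url, title) tuples, e.g. from a crawl export.
    """
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

crawl = [
    ("/help", "Help | foodpanda"),
    ("/faq", "Help | foodpanda"),   # duplicate title
    ("/about", "About us | foodpanda"),
]
print(find_duplicate_titles(crawl))
# → {'help | foodpanda': ['/help', '/faq']}
```

The same grouping works for meta descriptions by swapping in that field from the crawl export.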
Another important function in the migration process was that DeepCrawl allowed us to work with test pages, including pages requiring site authentication. This was especially useful for our migration project because we were able to compare the live and staging sites within the same project. We basically monitored everything using this function, combined with the static IP address selection, so that the “before and after” of the same user landing page could be shown. In other words, the right function at the right time.
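At its simplest, comparing a live crawl against a staging crawl means diffing the two URL sets. The sketch below assumes two sets of crawled URLs as input (a hypothetical simplification of what crawl-comparison reports surface): pages missing on staging risk losing traffic after launch, while pages that exist only on staging may need redirect or indexing review.

```python
def compare_crawls(live_urls, staging_urls):
    """Diff two crawls given as sets of URL paths."""
    return {
        "missing_on_staging": live_urls - staging_urls,  # launch risk: 404s / lost traffic
        "new_on_staging": staging_urls - live_urls,      # review for redirects / indexing
        "common": live_urls & staging_urls,              # compare page-level metrics here
    }
```

Running this per crawl before each rollout stage gives an early warning of pages that would break when the new platform goes live.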
OUTCOMES AND IMPACT
Thanks to all the testing possibilities and scenarios available within DeepCrawl, the site migration was smooth, and we felt confident that traffic and transactions wouldn’t be affected after the relaunch. In fact, the migration was hugely beneficial to our business, as you can see from the analytics screenshot below. We managed to maintain our audience AND gain new users, AND maximize their transactions!
There are 3 main takeaways that we learnt from using DeepCrawl for a site migration:
1 – Testing is fundamental: thanks to the highly customizable options within DeepCrawl, we were able to personalize tests for a large number of potential cases. That meant we knew in advance that we weren’t running any risks when going live with the new pages, as all the processes and scenarios were tested and mapped out ahead of time.
2 – Comparing data is key: being able to compare data, also over a substantial time span, is a great advantage when testing new pages and comparing them with older versions.
3 – Granularity and visualization of data: DeepCrawl’s presentation of data and its granularity helped us immediately visualize all issues – at any level – and to report to all involved departments without having to create graphs or tables.
The testing capabilities, comparison crawls, granular data & design were definitely huge time savers!