💻️ Reverse Engineering Ticketmaster

And how Uber scaled.

Hey everybody.

Today I share one of the hottest ethical dilemmas that has surfaced with the rise of AI agents: should chatbots be classed as crawlers? Also, at the end of today’s edition, there is a great blog post where Conduition reverse-engineers TicketMaster’s online ticket system.

AI

Are AI Chatbots humans or crawlers?
Perplexity, a popular AI platform aiming to compete with Google in the search market, is at the center of a legal drama right now. The platform allegedly repurposed a Forbes investigative article on its own platform.

This finding from Forbes has led to accusations of copyright infringement. Now, there are even more reports. This time it's alleged that AWS, Perplexity's web host, is investigating Perplexity's apparent disregard for robots.txt, the standard governing web crawler behavior.

Wired suggests that Perplexity's chatbot sometimes sneakily bypasses robots.txt, which raises ethical questions about respecting publishers' wishes. The heart of the issue lies in defining what constitutes a "crawler" under robots.txt. Perplexity argues that its system's behavior doesn't constitute crawling, and that it is no different from a user manually visiting a URL.

Robots.txt emerged in the 90s as a way to prevent web servers from being overloaded by search engine crawlers. This informal standard was later formalized by the IETF as the Robots Exclusion Protocol (RFC 9309).
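The protocol is simple enough that Python's standard library ships a parser for it. Here is a minimal sketch of how a well-behaved client checks robots.txt before fetching (the bot names and rules are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# A made-up robots.txt that blocks one crawler from /private/
rules = """
User-agent: ExampleBot
Disallow: /private/

User-agent: *
Disallow:
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A compliant crawler consults the rules before every fetch
print(parser.can_fetch("ExampleBot", "https://example.com/private/page"))
print(parser.can_fetch("OtherBot", "https://example.com/private/page"))
```

The debate around Perplexity is precisely about whether an AI system answering a user's question is obliged to make this check at all.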

As technology has advanced, all sorts of automated systems now access web content for a huge variety of reasons, which creates challenges when trying to fit them into traditional definitions. As AI agents continue to evolve rapidly, the ethical, economic, technical, and legal landscape will shift, ultimately requiring updated standards and practices.

Another great blog post on the topic was written by Robb Knight, who conducted his own investigation into how Perplexity scrapes data without users knowing.

Quick Links

🚕 Brief History of Scaling Uber
Today, Uber is the largest mobility platform, operating in over 70 countries and 10,500 cities. Uber Eats is the largest food delivery service outside China, spanning 45 countries. But how did Uber scale its tech foundations from scratch?

💻️ Hacker Stole Secrets From OpenAI
OpenAI experienced an undisclosed breach in early 2023. The attacker was able to access an employee forum, but they didn't reach systems building the AI. OpenAI didn't inform the FBI, citing no stolen customer or partner info and no national security threat. The incident has still sparked internal security concerns.

📹️ YouTube Embeds are Bananas Heavy and it’s Fixable
YouTube embeds are hefty at 1.3MB each and don't share resources. Switch to the <lite-youtube> Web Component—it’s just 100k, shares resources, and keeps all functionality intact. This custom element renders just like the real thing but approximately 224× faster.
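For reference, swapping an embed for the component looks roughly like this (the script and stylesheet paths and the attribute names follow the lite-youtube-embed project's README; double-check them against the current release):

```html
<!-- Load the custom element and its styles once per page -->
<script type="module" src="https://cdn.jsdelivr.net/npm/lite-youtube-embed/src/lite-yt-embed.js"></script>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/lite-youtube-embed/src/lite-yt-embed.css">

<!-- Renders a static thumbnail; the full YouTube iframe loads only on click -->
<lite-youtube videoid="ogfYd705cRs" playlabel="Play the video"></lite-youtube>
```

The trick is deferred loading: until the visitor clicks, the page carries only a thumbnail and a tiny script instead of YouTube's full iframe payload.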

🤔 Should interfaces be asynchronous?
Async and await are notoriously contagious. Should all interfaces be Task-based just in case? This seemingly simple question is fundamental and has deeper implications than you might realize.

GITHUB COPILOT

Demystify history with GitHub Copilot commit explanations
We’ve all been in the situation where we struggle to understand a commit's purpose or context. Well, you can use GitHub Copilot to help with that.

Copilot can now generate commit explanations, analyzing code differences and producing concise summaries. This saves time otherwise spent deciphering your Git history, and improves communication and teamwork through better documentation and transparency.

There's no doubt that Git history can be daunting, but it's critical for learning about a codebase or identifying bugs. With the GitHub Copilot-powered explain feature in the Commit Details window, you can generate a summary of changes alongside the code, highlighting key differences and the rationale behind them. This can take a while for large changesets or pull requests.

So how do you do this? Double-click any commit to open the Commit Details pane in the Git Repository window, then click the Explain button above the commit message for a summary of changes. For a closer look at code changes and their descriptions, click the expand option on the summary view.

REVERSE ENGINEERING

Conduition shared in a recent blog post how they successfully reverse-engineered TicketMaster's ticket system. TicketMaster now requires mobile entry to its events, known as SafeTix, which features rotating barcodes displayed on their web or mobile apps.

However, these barcodes have drawbacks. Less tech-savvy attendees may struggle to use a mobile app, and there is a whole host of other failure modes, from a phone running out of charge to breaking outright.

SafeTix is designed to make reselling tickets outside TicketMaster's platform impossible, funnelling resales through TicketMaster's own high-margin resale marketplace. It also forces users to install TicketMaster's app, giving the company more access to user data. The system prevents saving or transferring tickets outside the app.

In this blog post, Conduition figured out how to save and transfer tickets offline, essentially bypassing TicketMaster's restrictions.

They were able to extract the base64-encoded token property from the TicketMaster API, which let them duplicate the barcodes and, in theory, resell tickets outside TicketMaster's marketplace. With a valid rawToken, eventKey, and customerKey, anyone can generate legitimate barcodes, making it impossible for venue staff to distinguish duplicates from the official app.
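Rotating barcodes of this kind typically boil down to standard TOTP (RFC 6238). The sketch below is hypothetical: the 15-second period, the colon-separated payload layout, and treating eventKey and customerKey as base32 TOTP secrets are my assumptions for illustration, not details confirmed here; read Conduition's post for the actual scheme.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, timestamp: int, period: int = 15, digits: int = 6) -> str:
    """Standard RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = struct.pack(">Q", timestamp // period)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def barcode_payload(raw_token: str, event_key: str, customer_key: str) -> str:
    """Hypothetical SafeTix-style payload: the static bearer token plus two
    rotating one-time codes and a timestamp (layout is an assumption)."""
    now = int(time.time())
    return ":".join([raw_token, totp(event_key, now), totp(customer_key, now), str(now)])

print(barcode_payload("example-raw-token", "GEZDGNBVGY3TQOJQ", "MFRGGZDFMZTWQ2LK"))
```

The key point is that nothing in such a scheme is secret once the token and keys are extracted: anyone holding them can mint fresh, valid-looking barcodes forever.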

I highly recommend you read the full blog post, where Conduition breaks down exactly how they used simple methods, such as digging through Chrome web tools, to figure out the system employed by a multi-billion dollar company.

Until next week,

Travis.
