Most people familiar with Copilot technology know it as a powerful UI where Xaxis traders can configure algorithmic strategies, manage IOs, view programmatic performance data, and click around to interact in a cool, albeit manual, way. There are, however, individuals and Xaxis teams around the globe who need more automation and power than a UI can grant them. For this, we've designed the Copilot Developer API.
The goal of the Copilot API is to give engineering-minded Xaxians automated access to a subset of the backend functionality that we in the Copilot engineering department avail ourselves of every day. We ultimately want to make Copilot a more open tool for experimentation and this API build is a big step in that direction for our team.
The build
Building an API seemed simple enough given the Copilot integrations already in place; we were just creating our own internal REST API (for non-nerds: the same request-and-response pattern most of the web runs on). The real work took some “human intelligence” beyond just technical expertise. Opening up an API that touches a Demand Side Platform (DSP) in the AdTech space means media strategies can be changed and real advertiser money can be spent. The bigger technical challenges we had to consider in light of all this were user authorization and rate limiting, i.e. how frequently users can make requests.
User authorization is very important: again, you can actually change real, live bidding models in a DSP with this API, so we needed to be smart about who can access the tool. To enable this in a safe way, we connected with local Xaxis managers and had them identify a select group of test/beta users. In our UI, these managers can enable users by generating a token that, when sent in the headers of a REST request, grants access to our developer API. We closely track these tokens and the requests made with each one so we can properly debug any issues that might arise.
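For a sense of what that looks like from the user side, here is a minimal sketch of an authenticated call. The base URL, endpoint path, and header scheme are placeholders we invented for illustration, not the documented contract.

```python
# Minimal sketch of an authenticated call to the Copilot Developer API.
# The base URL, endpoint path, and header scheme below are illustrative
# assumptions, not the published contract.
import requests

API_BASE = "https://copilot.example.com/api/v1"   # hypothetical base URL
TOKEN = "token-issued-by-your-local-manager"      # generated for you in the Copilot UI

def list_strategies():
    """Fetch the Strategies this token's owner is allowed to see."""
    response = requests.get(
        f"{API_BASE}/strategies",
        headers={"Authorization": f"Bearer {TOKEN}"},  # token travels in the request headers
        timeout=30,
    )
    response.raise_for_status()
    return response.json()
```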
Additionally, to protect our system's integrity, we implemented rate limiting: users are currently allowed 20 requests per minute. This is to make sure that someone with a runaway script doesn't, say, repeatedly try to update every custom tree they have access to, and to ensure that our servers don't get unnecessarily overburdened. (Note: our servers are “auto-scaling,” meaning they will respond to sudden increases in demand, but we view too many requests sent simultaneously as more likely to be an error than intended behavior.)
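If you are scripting against the API, a small client-side throttle makes it easy to stay under that cap. The sketch below assumes rejected calls come back as HTTP 429; that status code is our assumption for the example rather than a documented guarantee.

```python
# Client-side throttle for staying under the 20-requests-per-minute cap.
# The HTTP 429 check is an assumption for this example; use whatever signal
# the API actually returns for a rejected call.
import time
import requests

MAX_PER_MINUTE = 20
_recent_calls = []  # timestamps of requests made in the last minute

def throttled_get(url, **kwargs):
    """Block until a request slot is free, then issue the GET."""
    global _recent_calls
    while True:
        now = time.monotonic()
        _recent_calls = [t for t in _recent_calls if now - t < 60]
        if len(_recent_calls) < MAX_PER_MINUTE:
            break
        time.sleep(1)  # wait for the oldest call to fall out of the window
    _recent_calls.append(time.monotonic())

    response = requests.get(url, **kwargs)
    if response.status_code == 429:  # hypothetical "too many requests" reply
        time.sleep(60)               # back off for a full window, then retry once
        response = requests.get(url, **kwargs)
    return response
```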
How this currently enables users
The final challenge was deciding what Copilot functionality to expose in the first iteration of the API. We chose our initial endpoints based on discussions with Copilot users about their most time-consuming tasks. Currently, the API lets users do the following:
- Generate and download log-level reports on our clustering algorithmic model. Our clustering strategy optimizes to multiple media KPIs at once by splitting bid decisions into small pockets of performance and incrementally adjusting bid price up or down for the well-performing and poorly performing pockets. These reports help explain what the Copilot Strategy did, where it split, and the performance benefits of using it.
- Directly update bidding models in Xandr. One of our Copilot Strategies is a portal that allows users to create their own custom models (either in csv or Bonsai text form). The API connection makes this task more seamless so local teams can, say, run their models daily, output a csv, and automatically update the DSP with their new calculations.
- Easily query information about existing DSP objects and the Strategies they are associated with. Combining this reporting access with the easy Strategy editing lets experimental users automate their modeling pipeline and activate it in the DSPs; a rough sketch of such a pipeline follows this list.
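To make that concrete, here is a rough sketch of the kind of daily pipeline these endpoints enable: look up a Strategy, pull its latest clustering report, and push a freshly computed custom model back to the DSP. Every endpoint path, field, and file name here is an illustrative assumption, not the published spec.

```python
# Rough sketch of an automated modeling pipeline built on the endpoints above.
# Endpoint paths, field names, and file names are illustrative assumptions.
import requests

API_BASE = "https://copilot.example.com/api/v1"     # hypothetical base URL
HEADERS = {"Authorization": "Bearer <your-token>"}   # token from your local manager

def run_daily_model_refresh(strategy_id, model_csv_path):
    # 1. Look up the Strategy and the DSP objects it is associated with.
    strategy = requests.get(
        f"{API_BASE}/strategies/{strategy_id}", headers=HEADERS, timeout=30
    )
    strategy.raise_for_status()

    # 2. Download the latest clustering log-level report for offline analysis.
    report = requests.get(
        f"{API_BASE}/strategies/{strategy_id}/reports/clustering",
        headers=HEADERS, timeout=120,
    )
    report.raise_for_status()
    with open("clustering_report.csv", "wb") as f:
        f.write(report.content)

    # 3. Upload the newly computed custom model (csv or Bonsai text) to the DSP.
    with open(model_csv_path, "rb") as f:
        upload = requests.post(
            f"{API_BASE}/strategies/{strategy_id}/custom-model",
            headers=HEADERS, files={"model": f}, timeout=120,
        )
    upload.raise_for_status()

    return strategy.json()
```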
Future use cases for the API might include granting more access to our raw data, exposing more of our DSP integration for advanced use cases, uploading custom outcome values, generating visualizations on demand, and so on. It really depends on what you want to do with Copilot.