Show HN: Ingest data from your customers (Prequel YC W21)
Hey HN! Charles here from Prequel (https://prequel.co). We just launched the ability for companies to import data from their customers' data warehouses or databases, and we wanted to share a bit more about it with the community. If you just want to see how it works, here's a demo of the product that Conor recorded: https://ift.tt/n9fHEpA.

Quick background on us: we help companies integrate with their customers' data warehouses or databases. We've been busy helping companies export data to their customers; we're currently syncing over 40bn rows per month on their behalf. But folks kept asking us whether we could help them import data from their customers too. They wanted to offer a 1st-party reverse ETL to their customers, similar to the 1st-party ETL capability we already helped them offer. So we built that product, and here we are.

Why would people want to import data? There are actually plenty of use cases. Imagine a usage-based billing company that needs a daily pull from its customers of all the billing events that happened, so that it can generate the relevant invoices. Or a fraud detection company that needs the latest transaction data from its customers so it can appropriately mark fraudulent transactions.

There's no great way to import customer data today. Typically, people solve this one of two ways. The first is importing data via CSV. This works well enough, but it requires ongoing work on the part of the customer: they need to put a CSV together and upload it to the right place on a daily/weekly/monthly basis. That's painful and time-consuming, especially for data that needs to be continuously imported. The second is making the customer write custom code to feed data to the company's API. This requires the customer to do a bunch of solutions-engineering work just to get started using the product, which is a suboptimal onboarding experience.

So instead, we let the customer connect their database or data warehouse and we pull data directly from there, on an ongoing basis. They select which tables to import (and potentially map some columns to required fields), and that's it. Setup only takes 5 minutes and requires no ongoing work. We feel like that's the kind of experience every company should provide when onboarding a new customer.

Importing all this data continuously is non-trivial, but thankfully we could reuse 95% of the infrastructure we built for data exports. It turns out our core transfer logic stays pretty much exactly the same, and all we had to do was ship new CRUD endpoints in our API layer to let users configure their source/destination. As a brief reminder about our stack, we run a Go backend and a TypeScript/React frontend on k8s.

In terms of technical design, the most challenging decisions we had to make were around getting databases' type systems to play nicely with each other (kind of an evergreen problem, really). For imports, we allow the data recipient to specify whether they want to receive the data as a JSON blob or as a nicely typed table. If they choose the latter, they specify exactly which columns they're expecting, as well as what type guarantees those should uphold. (There's a rough sketch of what such a configuration could look like below.) We're also working on the ability to feed that data directly into an API endpoint, and on adding post-ingestion validation logic.
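To make that last bit more concrete, here's a rough sketch, in Go (what our backend is written in), of the shape a per-table import configuration along these lines could take. The struct and field names below are made up for illustration; they're not our actual API, just one way to picture table selection, column mapping, and typed-column guarantees together.

```go
// Illustrative sketch only: not Prequel's real API. It shows the kind of
// per-table import configuration described above, where a data recipient
// declares which source table to pull, how source columns map to required
// fields, and what type guarantee each expected column must uphold.
package main

import (
	"encoding/json"
	"fmt"
)

// ColumnSpec describes one expected column and its type guarantee.
type ColumnSpec struct {
	Name     string `json:"name"`     // column name the recipient expects
	Type     string `json:"type"`     // e.g. "string", "integer", "timestamp"
	Nullable bool   `json:"nullable"` // whether NULL values are acceptable
}

// ImportConfig describes one table a customer has agreed to share.
type ImportConfig struct {
	SourceTable string            `json:"source_table"`         // table in the customer's warehouse
	Mode        string            `json:"mode"`                 // "json_blob" or "typed_table"
	ColumnMap   map[string]string `json:"column_map,omitempty"` // source column -> required field
	Columns     []ColumnSpec      `json:"columns,omitempty"`    // only relevant when Mode == "typed_table"
}

func main() {
	// Example: a usage-based billing vendor pulling daily billing events
	// from a customer's warehouse as a typed table.
	cfg := ImportConfig{
		SourceTable: "analytics.billing_events",
		Mode:        "typed_table",
		ColumnMap: map[string]string{
			"evt_ts":  "event_timestamp",
			"acct_id": "account_id",
		},
		Columns: []ColumnSpec{
			{Name: "event_timestamp", Type: "timestamp", Nullable: false},
			{Name: "account_id", Type: "string", Nullable: false},
			{Name: "amount_cents", Type: "integer", Nullable: false},
		},
	}

	out, _ := json.MarshalIndent(cfg, "", "  ")
	fmt.Println(string(out))
}
```

The idea is that a config like this would be created through the CRUD endpoints mentioned above, and each pull would be checked against the declared column types before the data lands on the recipient's side.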
We've mentioned this before, but it bears repeating: we know that security and privacy are paramount here. We're SOC 2 Type II certified, and we go through annual white-box pentests to make sure all our code is up to snuff. We never store any of the data anywhere on our servers. Finally, we offer on-prem deployments, so the data never even has to touch our infrastructure if our customers don't want it to.

We're really stoked to be sharing this with the community. We'll be hanging out here for most of the day, but you can also reach us at hn (at) prequel.co if you have any questions!
