Not long ago, we observed here in our blog that the critical insights that drive business value come from data that is both (1) fast and (2) reliable.
How do you get white-glove customer service from a major data supplier?
Perhaps, if you’re a Fortune 500 financial institution, you do it by picking up the phone or sending an email. Then again, if you’re a Fortune 500 financial institution, you probably have a large market-data team that maintains a close relationship with the data supplier, and that can call on internal data-engineering resources to resolve many issues with external datasets on its own.
Otherwise, it’s a trickier proposition.
Large data suppliers have customers of all shapes and sizes all over the world. Small institutions often lack the capacity or the sway to get the kind of speedy, hands-on resolution from a data vendor that a well-resourced Fortune 500 can. When there’s a problem with a data pipeline, a common pain point for small hedge funds and broker-dealers is the lack of a timely response.
Smaller institutions have, by definition, smaller teams: teams that lack the capacity to cultivate a close relationship with data suppliers the way a large enterprise can, but that, paradoxically, may need the help far more than a large enterprise would.
At the end of the day, small firms have smaller teams and fewer resources. But that doesn’t make their needs or their ambitions any smaller.
The external-data tragedy
In particular, smaller firms typically recognize that it’s much better to have as many eyes and hands on data ingestion as possible, and that doing so effectively in-house is challenging. It usually means scaling up in an infeasible or unsustainable way or, more commonly, pulling people over from a research, IT, or data-science team to handle data-engineering tasks. (And in small firms, those teams may be exceptionally small, sometimes as small as one person.)
The end result is that these data-ingestion needs take people away from the work that (1) they were hired to do and (2) actually drives the business. Or, if the firm can manage to hire full-time data engineers, those engineers wind up spending an average of 70% of their time on prepping and maintaining data for analytics—instead of on the analytics itself.
Moreover, because these small data-engineering teams quickly get worked to capacity, shortcuts happen: data engineers may find themselves perversely incentivized, or simply compelled, to toss messy, unstructured data over to the researchers, who typically have only limited Python skills with which to jury-rig the data into something resembling usability.
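To make that concrete, here is a purely hypothetical sketch of the kind of one-off cleanup script that often results. The file name, column names, and formats below are invented for illustration; a real vendor file would have its own quirks, and every delivery tends to need its own variation.

```python
# Hypothetical example only: the file name, columns, and formats are invented
# to illustrate the kind of ad-hoc cleanup a researcher might hack together.
import pandas as pd

# The header is buried under a few rows of supplier boilerplate in this delivery.
raw = pd.read_csv("vendor_prices_2024-06.csv", skiprows=3)

# Hand-patch the quirks of this particular file.
raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]
raw["trade_date"] = pd.to_datetime(raw["trade_date"], errors="coerce")
raw["close_price"] = pd.to_numeric(
    raw["close_price"].astype(str).str.replace(",", ""), errors="coerce"
)

# Drop whatever didn't parse and hope nothing important was in those rows.
clean = raw.dropna(subset=["trade_date", "close_price"]).drop_duplicates()
clean.to_parquet("prices_clean.parquet")
```

Scripts like this work until the supplier changes a column name, a date format, or a delivery path, and then they break silently. Multiply that fragility across every feed a firm buys and you get exactly the maintenance burden described above.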
And if the data-ingestion and data-integration work doesn’t get done at all? It’s not uncommon for financial firms of all sizes to find themselves paying for data for months without actually having it, because they have not yet been able to onboard it. This phantom data then goes forgotten by the research teams.
Onboarding external data is necessary, but it is repetitive, menial work—hardly a hedge fund’s “special sauce.” The tragedy of it is that the skilled, talented specialists at small firms who wind up dealing with data ingestion don’t instead get to build something big and exciting that propels the business forward—work that they are fully capable of doing and that they would probably much prefer to do.
In short, the data-ingestion problem is why small firms stay small.
Don’t leave growth on the table
Three questions for small financial firms, then, naturally follow:
What percentage of growth could you create if you didn’t have to put out data-ingestion fires?
What percentage of growth could you create if you could free yourself of the burden of worrying about the cleanliness, accessibility, usability, timeliness, or risks of external data?
What percentage of growth could you create if you could free your talented people to commit themselves to the work they joined your organization to do—the work they excel at?
These questions bear directly on profitability at large financial-services enterprises. For small financial firms, however, they are existential. The financial-services sector is highly competitive, with firms of all sizes exploiting any data-based edge they can find. At some point, if a firm’s DataOps can’t scale, the firm slides.
This is why financial institutions of all sizes trust Crux with their data-integration needs. Crux transforms external data into analytics-ready data customized to your internal needs.
We’ve made the case before that the external-data integration question must always be settled with “Buy” over “Build.” Organizations that try to in-house their external-data operations eventually find themselves doomed to perpetual capacity issues. Even organizations with a well-scaled DataOps team have plenty of room for improvement when it comes to speed, reliability, and efficiency (the 70% figure cited above applies to large firms just as much as it does to small firms).
Outsourcing to a managed-services provider that specializes in external-data integration is the only feasible, scalable, cost-effective, and growth-oriented solution. And there is no other company in the market that solves the external-data problem in the holistic way that Crux does—or as prolifically as Crux does. To date, Crux has built and optimized more than 60,000 pipelines from hundreds of data suppliers.
The upshot is that Crux offers its customers another important value-add: our relationships with suppliers. Day after day, our 24/7 data-engineering teams work closely with data suppliers to ensure that our customers and their downstream users have their data when they need it, where they need it, and how they need it.
The result? A white-glove customer experience for external data.
Just ask us about it.