Hacker News
Cloudflare's AI Platform: an inference layer designed for agents
james2doyle
Yes, you can see the same "hosted" ones on there, but when you look at the models endpoint, there are far fewer options in the "workers-ai/*" namespace. Is that intentional?
mikeocool
But in practice, it's basically impossible to use it that way in conjunction with Workers, since you have to bind every database you want to use to the Worker, and binding a new database requires redeploying the Worker.
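For context on why that redeploy is required: D1 bindings are declared statically in the Worker's configuration file, so reaching a new database means editing the config and redeploying. A minimal sketch (the binding names and IDs here are illustrative, not from any real project):

```toml
# wrangler.toml: every D1 database a Worker can reach must be listed
# as a binding; adding one means editing this file and redeploying.
[[d1_databases]]
binding = "TENANT_A_DB"        # exposed to the Worker as env.TENANT_A_DB
database_name = "tenant-a"
database_id = "00000000-0000-0000-0000-000000000000"

[[d1_databases]]
binding = "TENANT_B_DB"
database_name = "tenant-b"
database_id = "11111111-1111-1111-1111-111111111111"
```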
eis
But even without the network issues that have plagued it, I would hesitate to build anything for production on it, because it can't even do transactions, and the product manager for D1 has openly stated they won't implement them [0]. Your only way to ensure data consistency is a Durable Object, which comes with its own costs and tradeoffs.
https://github.com/cloudflare/workers-sdk/issues/2733#issuec...
The basic idea of D1 is great. I just don't trust the implementation.
For a hobby project it's a neat product for sure.
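To make the consistency concern concrete: D1 does document an atomic `batch()` of prepared statements, but there is no interactive BEGIN/COMMIT, so anything that needs to read before writing can half-apply. The sketch below stubs a D1-style batch API in memory purely to illustrate the all-or-nothing behavior; `FakeD1` and every name in it are hypothetical stand-ins, not the real D1 client API.

```typescript
type Row = { id: string; balance: number };

// Hypothetical in-memory stand-in for an atomic batch API, used only to
// demonstrate the semantics discussed above. Real D1 code would go through
// the Worker's binding (e.g. env.DB.batch([...prepared statements])).
class FakeD1 {
  rows = new Map<string, Row>([
    ["alice", { id: "alice", balance: 500 }],
    ["bob", { id: "bob", balance: 0 }],
  ]);

  // Apply all updates or none: snapshot first, roll back on any failure.
  batch(updates: Array<(rows: Map<string, Row>) => void>): boolean {
    const snapshot = new Map<string, Row>(
      [...this.rows].map(([k, v]) => [k, { ...v }] as [string, Row]),
    );
    try {
      for (const update of updates) update(this.rows);
      return true;
    } catch {
      this.rows = snapshot; // roll back, as an atomic batch would
      return false;
    }
  }
}

const db = new FakeD1();

// A transfer that must not half-apply: debit alice, credit bob.
const ok = db.batch([
  (rows) => {
    const a = rows.get("alice")!;
    if (a.balance < 100) throw new Error("insufficient funds");
    a.balance -= 100;
  },
  (rows) => {
    rows.get("bob")!.balance += 100;
  },
]);

console.log(ok, db.rows.get("alice")!.balance, db.rows.get("bob")!.balance);
// → true 400 100
```

If either closure throws, the snapshot is restored and neither balance changes, which is the guarantee you otherwise have to get from a Durable Object sitting in front of the database.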
BoorishBears
Cloudflare seems to be building for lock-in and I don't love it. I especially don't understand how you build an OpenRouter and only have bindings for your custom runtime at launch.
bm-rf
[1] https://developers.cloudflare.com/ai/models/
[2] https://developers.cloudflare.com/ai-gateway/features/unifie...
samjs
We'll be adding prices to the docs and the model catalog in the dashboard shortly.
In short: currently the pricing matches whatever the provider charges. You can buy unified billing credits [1] which charges a small processing fee.
> Finally, would be great if this could return OpenAI AND Anthropic style completions.
Agreed! This will be coming shortly. Currently we match whatever format the provider itself returns, but we plan to make it possible to specify an API format when using LLMs.
[1]: https://developers.cloudflare.com/ai-gateway/features/unifie...
6thbit
Rant aside, they are well positioned, network-wise, to offer this service. I wonder about their pricing and potential markup on top of token usage?
I presume they won't let you "manage all your AI spend in one place" for free.
koolba
Of course they will. In return they get to control who they're routing requests to. I wouldn't be surprised if this turns into the LLM equivalent of "paying for order flow".
wahnfrieden
edit: Why downvote? It's correct, and it's a risk that competitors handle better, including for their CDN products (compared to Bunny CDN). Maybe you are just used to the risk and haven't felt the burn yourself yet. Or you have the mistaken notion that there is no price at which temporary downtime is worthwhile to avoid paying.
stult
I immediately pulled all my sites off of Cloudflare and I will never use that godawful nightmare of a company for anything ever again. If they can't even host a generic help bot without screwing it up that badly, why would I ever use them for anything at all, never mind an AI platform?