πŸ“Œ TOPINDIATOURS Hot ai: Railway secures $100 million to challenge AWS with AI-native cloud

Railway, a San Francisco-based cloud platform that has quietly amassed two million developers without spending a dollar on marketing, announced Thursday that it raised $100 million in a Series B funding round, as surging demand for artificial intelligence applications exposes the limitations of legacy cloud infrastructure.

TQ Ventures led the round, with participation from FPV Ventures, Redpoint, and Unusual Ventures. The investment values Railway as one of the most significant infrastructure startups to emerge during the AI boom, capitalizing on developer frustration with the complexity and cost of traditional platforms like Amazon Web Services and Google Cloud.

"As AI models get better at writing code, more and more people are asking the age-old question: where, and how, do I run my applications?" said Jake Cooper, Railway's 28-year-old founder and chief executive, in an exclusive interview with VentureBeat. "The last generation of cloud primitives were slow and outdated, and now with AI moving everything faster, teams simply can't keep up."

The funding is a dramatic acceleration for a company that has charted an unconventional path through the cloud computing industry. Railway raised just $24 million in total before this round, including a $20 million Series A from Redpoint in 2022. The company now processes more than 10 million deployments monthly and handles over one trillion requests through its edge network β€” metrics that rival far larger and better-funded competitors.

Why three-minute deploy times have become unacceptable in the age of AI coding assistants

Railway's pitch rests on a simple observation: the tools developers use to deploy and manage software were designed for a slower era. A standard build-and-deploy cycle using Terraform, the industry-standard infrastructure tool, takes two to three minutes. That delay, once tolerable, has become a critical bottleneck as AI coding assistants like Claude, ChatGPT, and Cursor can generate working code in seconds.

"When godly intelligence is on tap and can solve any problem in three seconds, those amalgamations of systems become bottlenecks," Cooper told VentureBeat. "What was really cool for humans to deploy in 10 seconds or less is now table stakes for agents."

The company claims its platform delivers deployments in under one second β€” fast enough to keep pace with AI-generated code. Customers report a tenfold increase in developer velocity and up to 65 percent cost savings compared to traditional cloud providers.

These numbers come directly from enterprise clients, not internal benchmarks. Daniel Lobaton, chief technology officer at G2X, a platform serving 100,000 federal contractors, measured deployments running seven times faster and an 87 percent cost reduction after migrating to Railway. His infrastructure bill dropped from $15,000 per month to approximately $1,000.

"The work that used to take me a week on our previous infrastructure, I can do in Railway in like a day," Lobaton said. "If I want to spin up a new service and test different architectures, it would take so long on our old setup. In Railway I can launch six services in two minutes."

Inside the controversial decision to abandon Google Cloud and build data centers from scratch

What distinguishes Railway from competitors like Render and Fly.io is the depth of its vertical integration. In 2024, the company made the unusual decision to abandon Google Cloud entirely and build its own data centers, a move that echoes the famous Alan Kay maxim: "People who are really serious about software should make their own hardware."

"We wanted to design hardware in a way where we could build a differentiated experience," Cooper said. "Having full control over the network, compute, and storage layers lets us do really fast build and deploy loops, the kind that allows us to move at 'agentic speed' while staying 100 percent the smoothest ride in town."

The approach paid dividends during recent widespread outages that affected major cloud providers β€” Railway remained online throughout.

This soup-to-nuts control enables pricing that undercuts the hyperscalers by roughly 50 percent and newer cloud startups by a factor of three to four. Railway charges by the second for actual compute usage: $0.00000386 per gigabyte-second of memory, $0.00000772 per vCPU-second, and $0.00000006 per gigabyte-second of storage. There are no charges for idle virtual machines β€” a stark contrast to the traditional cloud model, where customers pay for provisioned capacity whether they use it or not.
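The per-second rates above can be sanity-checked with a quick back-of-the-envelope calculation. Below is a minimal sketch: the three rates come from the article, but the 30-day month and the `monthly_cost` helper are illustrative assumptions, not Railway's actual billing logic.

```python
# Rates quoted in the article (USD).
MEM_RATE = 0.00000386      # per GB-second of memory
VCPU_RATE = 0.00000772     # per vCPU-second
STORAGE_RATE = 0.00000006  # per GB-second of storage

# Assume a 30-day month for illustration.
SECONDS_PER_MONTH = 30 * 24 * 60 * 60  # 2,592,000 seconds

def monthly_cost(mem_gb: float, vcpus: float, storage_gb: float,
                 active_fraction: float = 1.0) -> float:
    """Estimated USD cost for a service active `active_fraction` of the month."""
    active_seconds = SECONDS_PER_MONTH * active_fraction
    rate_per_second = (mem_gb * MEM_RATE
                       + vcpus * VCPU_RATE
                       + storage_gb * STORAGE_RATE)
    return rate_per_second * active_seconds

# A 1 GB / 1 vCPU / 1 GB-storage service running 24/7:
print(round(monthly_cost(1, 1, 1), 2))  # about 30.17

# The same service active only 20% of the time (idle time is free):
print(round(monthly_cost(1, 1, 1, active_fraction=0.2), 2))  # about 6.03
```

The second call illustrates the claim about idle capacity: under per-second billing, a mostly idle service costs a fraction of an always-provisioned VM.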

"The conventional wisdom is that the big guys have economies of scale to offer better pricing," Cooper noted. "But when they're charging for VMs that usually sit idle in the cloud, and we've purpose-built everything to fit much more density on these machines, you have a big opportunity."

How 30 employees built a platform generating tens of millions in annual revenue

Railway has achieved its scale with a team of just 30 employees generating tens of millions in annual revenue β€” a ratio of revenue per employee that would be exceptional even for established software companies. The company grew revenue 3.5 times last year and continues to expand at 15 percent month-over-month.

Cooper emphasized that the fundraise was strategic rather than necessary. "We're default alive; there's no reason for us to raise money," he said. "We raised because we see a massive opportunity to accelerate, not because we needed to survive."

The company hired its first salesperson only last year and employs just two solutions engineers. Nearly all of Railway's two million users discovered the platform through word of mouth β€” developers telling other developers about a tool that actually works.

"We basically did the standard engineering thing: if you build it, they will come," Cooper recalled. "And to some degree, they came."

From side projects to Fortune 500 deployments: Railway's unlikely corporate expansion

Despite its grassroots developer community, Railway has made significant inroads into large organizations. The company claims that 31 percent of Fortune 500 companies now use its platform, though deployments range from company-wide infrastructure to individual team projects.

Notable customers include Bilt, the loyalty program company; Intuit's GoCo subsidiary; TripAdvisor's Cruise Critic; and MGM Resorts. Kernel, a Y Combinator-backed startup providing AI infrastructure to over 1,000 companies, runs its entire customer-facing system on Railway for $444 per month.

"At my previous company Clever, which sold …

Content automatically shortened.

πŸ”— Source: venturebeat.com


πŸ“Œ TOPINDIATOURS Breaking ai: Sam Altman in Damage Control Mode as ChatGPT Users Are Mass Cancelling Subscriptions

OpenAI just handed one of its biggest rivals a massive PR victory, in a blunder that even CEO Sam Altman admitted had optics that “don’t look good.”

On Friday, Altman announced that OpenAI had reached a new agreement with the Department of Defense over how its AI systems would be deployed across the military, an act that many saw as the company crossing the picket line. That’s because Anthropic, a company founded by former OpenAI employees, had refused to give in to the Pentagon’s demands that it give the military unrestricted use of its Claude AI, even as CEO Dario Amodei insisted that Anthropic’s AI not be used for autonomous weaponry or the mass surveillance of US citizens.

It was a move that could come at great cost for Anthropic. The Pentagon had vowed to ice the company out of contracts with the federal government by declaring it a “supply chain risk,” and even threatened to seize its tech.

But at least in the short term, it’s OpenAI that’s facing more blowback for its decision. Online, scores of users β€” ranging from your typical AI bro to, we kid you not, Katy Perry β€” are saying they’re ditching ChatGPT in favor of Claude because of Altman’s deal with the Pentagon. Indeed, Claude surged to the top of the App Store over the weekend, and as of Monday, still claims the number one spot above ChatGPT, which is currently in second place.

A recent thread in the r/ChatGPT subreddit calling on users to quit the AI chatbot quickly became one of the forum’s most highly-upvoted posts of all time.

“You’re now training a war machine,” the thread reads. “Let’s see proof of cancellation.”

The fierce backlash came despite Altman claiming that the DoD agreement included the same restrictions that Anthropic had been seeking. But in the eyes of many of its users and critics alike, the fact that OpenAI had reached an agreement at all while Anthropic refused to bend the knee was a sign of its capitulation to a deeply unpopular administration. The ethics of a company founded on supposedly beneficent principles now allowing its AI systems to be deployed across the US military faced an immediate test: just hours after Altman announced the agreement on Friday, the US and Israel launched a series of deadly strikes in Iran that killed its leader Ruhollah Khomeini and hundreds of civilians. (Reports suggest that the DoD used Claude to select targets in Iran, meaning even Anthropic’s principled stand may be yet more theater from the AI industry.)

Altman, meanwhile, has been in damage control mode. Following the deal’s announcement, he hosted a rare AMA on X where he fielded questions about OpenAI’s work with the “DoW” β€” referring to the “Department of War,” the Trump administration’s preferred moniker for the DoD β€” and respondents didn’t hold back.

“How did you go from ‘a tool for the betterment of the human race’ to ‘let’s work with the department of WAR’?” asked one user. Another mocked Altman by asking if he was happy that Claude overtook ChatGPT on the App Store. “No,” Altman conceded.

One of the most pressing questions concerned what OpenAI would do if the DoD issued orders that violated the constitution, or sought to carry out mass domestic surveillance. Altman’s line was that OpenAI would refuse any such orders, even if it meant imprisonment. (“Please come visit me in jail if necessary,” he quipped.)

But he also exhibited a blind faith that this would never be an issue by, more or less, extolling the virtues of the armed forces. Altman asserted that the “people in our military are far more committed to the constitution than an average person off the streets,” and uncritically cited a statement from a DoD official who vowed that it would never infringe on Americans’ civil liberties or engage in “unlawful” surveillance.

Such pinky-promises from Trump administration figures were apparently enough to convince Altman that everything the military did or has ever done is entirely above board, to overlook the fact that the administration has leaned on cutting edge surveillance tech to carry out mass deportations, and to memory-hole the name “Edward Snowden.”

“I would also be terrified of a world where our government decided mass domestic surveillance was ok,” Altman wrote at one point. “I don’t know how I’d come to work every day if that were the state of the country/Constitution.”

OpenAI users rightfully viewed Altman’s feigned ignorance as an insult to their intelligence. “You cannot post the statements by an Administration that is known to lie and expect people to have trust or confidence in [you or your company],” one fumed.

At the end of the day, even Altman couldn’t deny the scale of the PR disaster he had created for himself. The DoD deal, he admitted, “was definitely rushed, and the optics don’t look good.”

More on OpenAI: Anthropic Blowout With Military Involved Use of Claude for Incoming Nuclear Strike

The post Sam Altman in Damage Control Mode as ChatGPT Users Are Mass Cancelling Subscriptions Because OpenAI Is “Training a War Machine” appeared first on Futurism.

πŸ”— Source: futurism.com


πŸ€– TOPINDIATOURS Note

This article is an automatic summary compiled from several trusted sources. We pick trending topics so you always stay up to date.

βœ… Next update in 30 minutes β€” a random topic awaits!