Deepwater AI Summit 2025
Artificial Intelligence, Venture Capital, Venture Portfolio


Deepwater was grateful to host the Deepwater AI Summit in Minneapolis recently, with 120 business leaders in attendance representing companies from startups to the Fortune 500. We had in-depth discussions on leveraging AI in the enterprise, investing in AI, and what you can do to move your AI efforts forward.

We had a fantastic group of companies speak, with representatives from OpenAI, Databricks, Invisible Technologies, Anthropic, Fastino, CTGT, Chief, Not Diamond, and Hydra.

Below are the highlights and takeaways from the event:

Fireside Chat Summaries

Will Saborio, Sales Leader at OpenAI, a developer of frontier AI models and maker of ChatGPT, moderated by Mike Olson (Deepwater) and Gene Munster (Deepwater)

  • Serious scale. ChatGPT now serves roughly 10% of the world’s population on a very frequent basis.
  • Use cases are converging. Personal vs. professional use of AI is a false dichotomy, and that line will only blur further. This makes tailoring the product offering and sales motion to both the enterprise and the consumer a major challenge. In 10 years, ChatGPT will be an ever-present assistant that exists by your side in all contexts.
  • Enterprise ROI takes focus and effort. The models are probabilistic by nature, which makes using them in deterministic fields like finance or healthcare a tricky proposition. To get real utility for your organization, deployments require training both the model and your employees.


Matt Fitzpatrick, CEO at Invisible, building custom AI applications for large enterprises, moderated by Doug Clinton (Deepwater)

  • AI in the enterprise is still early. Only 6-8% of AI models make it into real enterprise workflows. AI is not plug-and-play; it requires a thoughtful plan, tailored data, and production infrastructure.
  • Data is the gating factor. 70% of U.S. software is 20+ years old. Most legacy IT systems aren’t built for AI. Success depends on structuring data and processes before using an AI model.
  • Human-in-the-loop models are today’s reality. Invisible’s “cyborg workflow” blends AI with human effort to ensure reliable, deployable outcomes for complex tasks. This hybrid approach is currently the most viable type of enterprise AI.
  • Measuring AI’s success is tricky but essential. Clear ROI objectives, iterative portfolio-style experimentation, and hands-on engineering are more effective than top-down strategies or expecting instant transformation.


Josh Lillie, Director, Startups at Databricks, an AI-native data cloud platform, moderated by Andrew Murphy (Deepwater)

  • Enterprise traction is real. Databricks is one of the leading companies helping organizations adopt AI, and it currently works with more than 60% of the Fortune 500.
  • Find your edge first. Databricks advises business leaders to identify their unique advantage and their most important challenges as a first step toward building great AI applications.
  • Easy integration matters. Databricks integrates with customers’ existing technology to give them maximum flexibility in their infrastructure decisions.
  • Your data is growing and must be used wisely. Data within organizations continues to grow at a staggering rate. As companies adopt AI, it is important that they manage and leverage their unique data.

Top 10 Takeaways for Business Leaders

  1. AI use cases are still rudimentary, underscoring how early we are and suggesting that now is the time to act.
  2. AI + human collaboration is key for successful enterprise deployment.
  3. Your unique enterprise data is your greatest AI asset.
  4. Focus on operationalizing AI to target critical business problems.
  5. Plan to measure quantifiable business outcomes from AI projects.
  6. Rigorous evaluation and testing help to build trust in AI systems within your team.
  7. Successful AI implementation requires organizational buy-in, especially from management.
  8. Specialized AI models and applications will drive specific business value.
  9. The future of AI likely involves using multiple models to optimize a single AI solution.
  10. Enterprises may need to build custom AI solutions instead of buying off-the-shelf solutions.

Panel Discussion Summaries

Aaron Ginn, CEO of Hydra, a GPU rental marketplace, and Chris Chen, Head of Revenue at Not Diamond, an AI model routing and optimization platform, moderated by Doug Clinton (Deepwater)

The price and performance of AI models will improve with intelligent routing services.

  • Model Routing: Ultimately, models and access to GPUs will be commoditized. Once they are, routing to the most appropriate model or GPU for a given use case becomes critical (see the sketch after this list).
  • Enterprise AI Adoption Priorities: As enterprises adopt AI, they think about speed, accuracy, and cost, in that order. First, companies look for access to GPUs and models that will get their product to market fastest. Once products are developed, performance becomes the driving factor. Finally, companies look to optimize cost.
  • Model Landscape: Open-source models are pulling away from closed-source models. On top of this, the open-source models from China are ahead of those from Meta. OpenAI may take steps towards moving to open-source in the future.
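
Below is a minimal sketch, in Python, of the routing idea described above. The model names, quality scores, latencies, and prices are illustrative assumptions, not data from Not Diamond or Hydra; a production routing service would score candidates from live benchmarks and pricing.

```python
# Minimal model-routing sketch: pick the cheapest/fastest option that clears
# an accuracy bar. All numbers below are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class ModelOption:
    name: str
    accuracy: float     # assumed benchmark quality score, 0-1
    latency_ms: float   # assumed typical response time
    cost_per_1k: float  # assumed dollars per 1K tokens

CATALOG = [
    ModelOption("large-frontier-model", accuracy=0.92, latency_ms=2200, cost_per_1k=0.060),
    ModelOption("mid-size-model",       accuracy=0.85, latency_ms=600,  cost_per_1k=0.010),
    ModelOption("small-task-model",     accuracy=0.74, latency_ms=120,  cost_per_1k=0.001),
]

def route(required_accuracy: float, weight_speed: float, weight_cost: float) -> ModelOption:
    """Treat accuracy as a hard floor, then trade off speed and cost by weight."""
    eligible = [m for m in CATALOG if m.accuracy >= required_accuracy] or CATALOG
    return min(eligible, key=lambda m: weight_speed * m.latency_ms + weight_cost * 1000 * m.cost_per_1k)

# Example: a routine extraction task values cost and speed over peak accuracy.
print(route(required_accuracy=0.80, weight_speed=1.0, weight_cost=5.0).name)
```

In this toy version the accuracy requirement is a hard floor while speed and cost are traded off through the weights; a real routing service would tune those knobs to match the adoption priorities described above.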


Cyril Gorlla, CEO of CTGT, building truly intelligent AI, and AJ Christensen, Head of Product at Fastino, developing task-optimized AI models, moderated by Mike Olson (Deepwater)

As the AI ecosystem matures, we will witness a fundamental shift in model design and implementation.

  • Specialists replacing generalists: We are moving from a world with a few dominant models to a world with many models, each calibrated for a specific use case.
  • We’re living in the 90s: The current paradigm of several large models intermediated by chatbots and complex prompting will eventually seem archaic – like the internet of the early 90s does today.
  • Every aspect will be optimized: This change will be brought about by the whole AI ecosystem maturing. We will create new techniques, like those developed by Fastino and CTGT, that improve efficiency, optimize the user interface, and tailor the experience to each consumer.
  • Efficient = resilient and cheap: Many enterprise AI tasks involve calling a model API 20+ times. Most of these smaller tasks don’t need to be done by an LLM with billions of parameters. Rearchitecting this process lowers cost, speeds up the workflow, and increases resilience (see the sketch below).
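
The following is a hedged sketch of that rearchitecting, assuming a hypothetical two-tier setup where routine sub-tasks go to a small model and only the hard steps reach a large LLM. The task names and tier labels are invented for illustration and are not how Fastino or CTGT implement this.

```python
# Hypothetical two-tier dispatch: routine sub-tasks go to a small model,
# and only the genuinely hard steps reach a large LLM. Task names are invented.
SMALL_MODEL_TASKS = {"classify", "extract_field", "validate_format"}

def pick_tier(task_type: str) -> str:
    """Return which model tier should handle a given sub-task."""
    return "small-model" if task_type in SMALL_MODEL_TASKS else "large-llm"

# A workflow making 21 model calls, echoing the 20+ API calls cited above.
workflow = ["extract_field"] * 15 + ["classify"] * 5 + ["summarize_report"]
tiers = [pick_tier(t) for t in workflow]
print(f"{tiers.count('small-model')} of {len(tiers)} calls avoid the large LLM")
```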


Bret Larsen, CEO at Chief, automating business operations, and Ryan Libster, Head of Sales, Claude for Work, at Anthropic, building frontier AI systems, moderated by Andrew Murphy (Deepwater)

Successful AI implementation isn’t about the model – it’s about preparing data, aligning with human workflows, and solving real problems through an outcome-driven approach.

  • Data is the key: The most valuable input for AI is your own business data—AI becomes powerful when trained on context-rich, outcome-driven operations specific to your workflows.
  • Adoption > technology: Tools aren’t the hard part—getting leadership buy-in, managing employee expectations, and embedding AI in day-to-day processes are what make or break implementation.
  • Track ROI from the start: Combine soft (productivity, focus shifts) and hard metrics (time saved, outcome speed) to measure effectiveness and refine use cases.
  • Interface and usability matter: As the tech matures, prompt engineering will become less critical, but intuitive interfaces and seamless access to enterprise knowledge will be vital.
