Oct. 20, 2025

He "kind of" had PMF for 8 years—until, after a rebuild, he grew 10x to $25M ARR. | Ben Alarie, Founder of Blue J

Ben Alarie spent 8 years building Blue J with "partial product market fit"—real customers, real revenue, but no real market pull. Then he made a bet that would either kill the company or 10x it: he put the existing product in maintenance mode and gave his team 6 months to rebuild everything from scratch using a technology that barely worked.

Two years later, Blue J went from $2M to $25M in ARR. They're adding 10 new customers every single day. NPS went from 20 to 84.

This isn't a story about getting lucky. It's about a founder who knew—with absolute conviction—that the market would eventually arrive, and made sure he was ready when it did. But it's also about the danger of fooling yourself into thinking you have PMF when you only "kind of have PMF."

Why You Should Listen:

  • Learn the brutal difference between fake and real PMF
  • Discover when to abandon millions in existing ARR to go all-in on something else
  • Understand why "time to value" might be the single most important metric for word-of-mouth
  • See what it takes to survive until the market is ready

Keywords:

startup podcast, startup podcast for founders, product market fit, founder journey, early stage startup, startup pivot, AI startup, SaaS growth, founder advice, hypergrowth startup

Chapters:

(00:02:00) Starting Blue J
(00:09:26) Introducing AI to Tax Research
(00:12:44) Starting to Build
(00:17:03) Not Having True PMF
(00:19:44) Believing in Retrieval Augmented Generation
(00:25:34) Updating to V2 of Blue J
(00:30:58) The Necessity of Time to Value
(00:33:47) When You Knew You Had PMF
(00:38:19) One Piece of Advice

Send me a message to let me know what you think!


WEBVTT

00:00:00.239 --> 00:00:03.600
What we learned was there was partial product market fit.

00:00:03.759 --> 00:00:16.960
The models that we built were actually functionally very useful when you had one of those kinds of cases because they would give you directionally a really good sense of how to handicap your odds if something were to go into litigation or be challenged by the government.

00:00:17.199 --> 00:00:19.600
And he said, This is a tricky problem.

00:00:19.760 --> 00:00:22.559
It took us two weeks to do this internally.

00:00:22.719 --> 00:00:23.440
Can you try this?

00:00:23.519 --> 00:00:28.559
And he described the question and I pushed enter, and I'm like, okay, let's see what Blue J comes up with.

00:00:28.640 --> 00:00:31.440
And Blue J started bringing in the answer.

00:00:31.679 --> 00:00:36.240
This guy stood up, walked up to the screen, and he was like reading it line by line.

00:00:36.399 --> 00:00:38.880
And he said, That's the answer we came up with.

00:00:39.039 --> 00:00:40.560
It was fully conversational.

00:00:40.799 --> 00:00:42.719
You didn't have to wait nearly as long for an answer.

00:00:42.799 --> 00:00:45.520
It was like 15 seconds rather than 90 seconds.

00:00:45.679 --> 00:00:52.240
And we grew from like roughly $2 million in ARR to just shy of $9 million in ARR by the end of 2024.

00:00:52.399 --> 00:00:54.719
It's like, okay, now this is really working.

00:00:56.719 --> 00:00:58.320
That's product market fit.

00:00:58.479 --> 00:00:59.439
Product market fit.

00:00:59.600 --> 00:01:00.399
Product market fit.

00:01:00.560 --> 00:01:02.159
I called it the product market fit question.

00:01:02.320 --> 00:01:03.039
Product market fit.

00:01:03.200 --> 00:01:03.920
Product market fit.

00:01:04.239 --> 00:01:05.200
Product market fit.

00:01:05.359 --> 00:01:06.319
Product market fit.

00:01:06.560 --> 00:01:08.719
I mean, the name of the show is product market fit.

00:01:08.959 --> 00:01:11.439
Do you think the product market fit show has product market fit?

00:01:11.599 --> 00:01:13.920
Because if you do, then there's something you just have to do.

00:01:14.000 --> 00:01:15.359
You have to take up your phone.

00:01:15.519 --> 00:01:17.359
You have to leave the show five stars.

00:01:17.519 --> 00:01:20.879
It lets us reach more founders and it lets us get better guests.

00:01:21.040 --> 00:01:21.840
Thank you.

00:01:22.959 --> 00:01:24.159
Ben, welcome to the show, man.

00:01:24.319 --> 00:01:24.879
Thanks, Pablo.

00:01:25.280 --> 00:01:28.640
Most people I interview on the show, I just meet right before I interview them.

00:01:28.799 --> 00:01:31.439
We've been investors in your company for a while now.

00:01:31.519 --> 00:01:38.640
I think we invested in like 2017, 2018, and you've been working on this uh since 2015, but you hit a major, major inflection point.

00:01:38.879 --> 00:01:40.640
When would you say like was that a year and a half ago?

00:01:40.719 --> 00:01:41.519
Two years, is that right?

00:01:41.680 --> 00:01:43.120
Yeah, I would say about two years ago.

00:01:43.280 --> 00:01:45.120
And you just raised what was the latest round?

00:01:45.359 --> 00:01:53.280
Uh we raised a Series D round, and that closed in July of 2025, and it was 122 million USD.

00:01:53.519 --> 00:01:55.120
So let's start at the beginning, right?

00:01:55.200 --> 00:01:57.120
You started this in 2015.

00:01:57.280 --> 00:01:58.319
What were you doing at that time?

00:01:58.400 --> 00:02:00.640
And what did you see that made you start Blue J in the first place?

00:02:00.879 --> 00:02:03.760
You know, I I started out as a tax law professor.

00:02:03.840 --> 00:02:06.959
I'm still I'm still a tax law professor at the University of Toronto.

00:02:07.040 --> 00:02:22.000
But I had gone to undergrad and studied economics, finance, philosophy, went into law school at the University of Toronto, did a graduate degree in economics concurrently with my law degree, and then went on to graduate studies in law at Yale Law School.

00:02:22.080 --> 00:02:27.120
And then I moved to Ottawa and I was a law clerk at the Supreme Court for a year.

00:02:27.280 --> 00:02:33.039
And immediately after that, I became a tenure-track law professor at the University of Toronto at age 26.

00:02:33.120 --> 00:02:34.240
I think maybe the youngest.

00:02:34.479 --> 00:02:35.039
How does that happen?

00:02:35.120 --> 00:02:36.000
Because that's not, yeah, not.

00:02:36.400 --> 00:02:40.000
No, maybe the youngest tenure-track law prof in the history of the law school.

00:02:40.240 --> 00:02:44.400
I think uh a whole bunch of things went really my way on that one.

00:02:44.560 --> 00:02:47.280
I had an interesting research agenda.

00:02:47.520 --> 00:02:53.759
I had published some articles as a law student, and I was super keen on the academic path.

00:02:53.919 --> 00:03:00.319
I was really excited about the life of the mind and doing research and going deep on tax law.

00:03:00.400 --> 00:03:04.240
And so that's what I spent the ensuing decade really focused on.

00:03:04.400 --> 00:03:14.319
So from 2004 right through to when I started Blue J, I was full tilt into becoming the best possible tax law academic that I could be.

00:03:14.400 --> 00:03:17.039
And so I wrote dozens of academic articles.

00:03:17.199 --> 00:03:23.919
I co-authored several editions of the leading tax book used to teach tax in Canadian law schools.

00:03:24.000 --> 00:03:26.319
Um, the book's called Canadian Income Tax Law.

00:03:26.479 --> 00:03:32.080
And I co-authored with the lead author David Duff, the second, third, fourth, fifth, and sixth editions of that book.

00:03:32.319 --> 00:03:32.800
Did you like it?

00:03:32.879 --> 00:03:34.879
Like, was it everything you kind of expected it to be?

00:03:34.960 --> 00:03:36.080
The professor life?

00:03:36.400 --> 00:03:38.560
The professor life is an amazing life.

00:03:38.719 --> 00:03:41.759
It's uh you get to research things that you're interested in.

00:03:41.919 --> 00:03:47.520
The students are evergreen and very, very talented and very ambitious.

00:03:47.759 --> 00:03:52.879
And the classroom discussion is always interesting and engaging and challenging.

00:03:53.120 --> 00:03:58.879
And if you approach law teaching with the right attitude, you learn as much from your students as they learn from you.

00:03:59.039 --> 00:04:03.120
And so, yes, being a law professor is an amazing privilege.

00:04:03.280 --> 00:04:04.319
I really loved it.

00:04:04.400 --> 00:04:05.439
And I still do love it.

00:04:05.599 --> 00:04:09.840
I may return to it in my post-Blue J life on a full-time basis.

00:04:10.000 --> 00:04:21.439
But it was the context of actually becoming associate dean of the law school and leading a reform of the curriculum that got me thinking about AI in the first place.

00:04:21.600 --> 00:04:23.040
And so I petitioned the dean.

00:04:23.120 --> 00:04:37.199
I said, if I'm going to become associate dean, uh, one of the things I'd like to do is to lead reform of the JD curriculum because I had just been through the law school curriculum from 1999 to 2002 when I was a law student.

00:04:37.360 --> 00:04:41.680
And I felt like it could be done in a different and probably a more effective way.

00:04:41.920 --> 00:04:42.879
Was it just like outdated?

00:04:42.959 --> 00:04:44.720
Like, I mean, a lot of education I think is outdated.

00:04:44.800 --> 00:04:52.959
Was that kind of just the general problem? One of the characteristic things about the curriculum at the time was that it was evaluated virtually entirely through 100% final examinations.

00:04:53.040 --> 00:05:00.639
So you would start the first year of law school in September, and you would have virtually no evaluations that counted for anything until April.

00:05:00.720 --> 00:05:06.959
And then you'd sit a series of six or seven final exams, each of which would be worth 100% of the final grade.

00:05:07.040 --> 00:05:07.279
Wow.

00:05:07.439 --> 00:05:25.680
And so you wouldn't really know how well you were doing until you got your grades sometime in the summer following your first year of law school, which seemed like not the best way to allow people to learn and iterate as to their learning approaches and really figure out how well they are learning the law.

00:05:25.920 --> 00:05:31.439
And so it was as a result of co-leading this reform that I started thinking about AI.

00:05:31.519 --> 00:05:33.360
And it happened pretty innocently.

00:05:33.439 --> 00:05:39.839
I asked one of my colleagues what they thought about the curriculum reform, and they said, Ben, you really shouldn't do this.

00:05:40.079 --> 00:05:41.360
I said, Oh, why?

00:05:41.519 --> 00:05:57.680
They said, Well, the last time we had a major reform of the law school curriculum was in the early 1970s, and it led to some hard feelings among some faculty who never really got along as well after that effort as they did before it.

00:05:57.839 --> 00:06:02.560
They said people just feel strongly about the curriculum and how they approach it.

00:06:02.639 --> 00:06:07.279
And so it's a bit of a minefield, and you'd be better off just focusing on your scholarship.

00:06:07.439 --> 00:06:14.720
And I didn't take from that the lesson that I think that colleague, who was hoping to be helpful, was wanting to deliver, which is to just kind of back off.

00:06:14.800 --> 00:06:16.560
Instead, I thought, oh, that's really interesting.

00:06:16.800 --> 00:06:20.160
It's been 40 years since somebody tried to do this.

00:06:20.319 --> 00:06:23.920
And I thought, okay, for one thing, it's probably far overdue.

00:06:24.000 --> 00:06:29.519
We should do it again and move to change the curriculum, how we're approaching teaching the law.

00:06:29.680 --> 00:06:43.680
And the second thing I took away from it was this observation, which was, well, if it took 40 years for someone like me to show up and suggest that we should really do a deep dive into changing the curriculum, it may be another 40 years before someone like me shows up and wants to do it again.

00:06:43.759 --> 00:06:48.560
And if if that's true, and I had this thought, if that's true, then what does that mean?

00:06:48.720 --> 00:06:57.040
And I immediately started thinking about AI and machine learning, just trying to go to the future, not the present, but kind of what's coming next.

00:06:57.199 --> 00:07:01.759
Yeah, let's imagine what the societal backdrop is 40 years from now.

00:07:01.920 --> 00:07:04.959
So at that point, it would have been like thinking about the year 2050.

00:07:05.120 --> 00:07:14.000
What's probably going to be true in the year 2050 that we should be taking into account if we're going to rebuild this curriculum, what should we be planning for to make it robust to?

00:07:14.079 --> 00:07:25.519
And I was like, well, Moore's law has been pretty robust, and computing power is doubling every couple of years, and its affordability is improving exponentially as well.

00:07:25.600 --> 00:07:27.519
And it's like, okay, so that's coming.

00:07:27.600 --> 00:07:36.240
And then I thought about the fact that legal information, even at that time, was virtually all digital, and algorithms were improving significantly.

00:07:36.319 --> 00:07:43.040
So I was aware of some of the work that Geoff Hinton and other colleagues were doing in the computer science department at the University of Toronto.

00:07:43.120 --> 00:07:49.839
And I kind of felt like, oh man, there is a freight train with AI on the front of it coming directly for the law.

00:07:50.000 --> 00:08:03.040
And thought, wow, this has big implications for this curriculum reform, but also for you know the legal system generally, for the judiciary, and also selfishly and closer to home, like for me personally.

00:08:03.199 --> 00:08:16.000
And so I started to think about okay, what if I'm standing at the front of a law school classroom in the year 2040, thinking, oh man, I saw this coming in 2012, 2013, 2014.

00:08:16.240 --> 00:08:19.439
And I decided not to change my professional trajectory.

00:08:19.519 --> 00:08:22.079
And I just decided to stay on the same path.

00:08:22.160 --> 00:08:28.240
And I started to think I would probably feel some profound regret about that if I weren't to do anything about it.

00:08:28.319 --> 00:08:31.199
And then I thought, well, what is it that I should really do about it?

00:08:31.360 --> 00:08:37.279
I mean, okay, changing the curriculum is one thing, and we worked on that and introduced the curriculum reform.

00:08:37.360 --> 00:08:39.279
And I think it's been successful.

00:08:39.440 --> 00:08:44.399
You know, we have more than another decade of experience with the reformed curriculum, and I think it's been successful.

00:08:44.559 --> 00:08:48.080
But I started thinking, okay, so what does this mean for tax law?

00:08:48.240 --> 00:08:52.639
And I'd been through this exercise of revising several editions of this tax book.

00:08:52.799 --> 00:08:59.600
It's like a 1500-page introductory text on Canadian income tax law, and it's such a manual job to update it.

00:08:59.679 --> 00:09:07.440
There are new cases coming down from the tax court, from the federal court all the time, the Supreme Court, the legislation is being frequently amended by Parliament.

00:09:07.600 --> 00:09:09.440
Things are always changing in tax law.

00:09:09.519 --> 00:09:14.480
And so you have to go through and you have to laboriously research each point to see what has happened.

00:09:14.720 --> 00:09:22.320
And it's always true that no sooner is the new edition of the book published than the law has changed and it's already out of date.

00:09:23.120 --> 00:09:25.679
I'm like, there's gotta be a better way to do this.

00:09:25.840 --> 00:09:32.399
I was like, okay, well, so clearly there's a role for AI in improving tax research.

00:09:32.639 --> 00:09:34.480
Okay, so what does that mean for me?

00:09:34.559 --> 00:09:37.200
And I was like, well, somebody's going to improve this.

00:09:37.519 --> 00:09:39.360
I started to think, well, who's that gonna be?

00:09:39.519 --> 00:09:41.519
I started to wonder, well, might it be me?

00:09:41.840 --> 00:09:42.320
Surely not.

00:09:42.480 --> 00:09:45.519
Like, surely it's not a law professor who's gonna be leading this effort.

00:09:45.679 --> 00:09:47.279
Like, surely it's gonna be somebody else.

00:09:47.360 --> 00:09:49.840
But I thought about the different classes of people that it might be.

00:09:49.919 --> 00:10:03.360
It might be a tax law partner in a big firm, but that's so extraordinarily unlikely because they already have clients that they're servicing, they're part of a partnership, they're kind of bought in, they kind of have dedicated their lives to that path.

00:10:03.519 --> 00:10:10.879
It could be somebody who's doing tax at an accounting firm, but it's a very similar, it's a professional services role and they're on that path.

00:10:11.039 --> 00:10:15.120
They have clients, um, they're they enjoy like a relatively good living.

00:10:15.279 --> 00:10:20.879
And what draws somebody into that kind of career path is it's a known path and it's relatively predictable.

00:10:20.960 --> 00:10:25.679
And they are not dispositionally typically going to be the kinds of people who are like, oh, I know what I'm gonna do.

00:10:25.759 --> 00:10:32.720
I'm gonna do a tech startup and kind of roll the dice and see if I can make it work, because the opportunity cost of doing that would be so significant.

00:10:32.879 --> 00:10:42.480
And so I thought, well, okay, it's not gonna be somebody with a tax law background, might it be somebody who's more kind of pure tech, like an AI or machine learning background?

00:10:42.639 --> 00:10:48.639
And again, I thought, well, probably not, because they would need to know a lot more tax law than than they do.

00:10:48.799 --> 00:10:53.039
It would take them a decade to catch up to where I was in my understanding of tax law.

00:10:53.120 --> 00:11:00.639
And they're probably just not all that interested in learning tax law, and they wouldn't really understand the pain of tax research, right?

00:11:00.720 --> 00:11:06.720
Like they'd probably just prefer to go work for one of the FANG companies or do a startup in in some other area.

00:11:06.879 --> 00:11:09.679
And so it's probably not going to be one of those folks.

00:11:09.840 --> 00:11:12.639
It's like, well, maybe, maybe it is, maybe it could be me.

00:11:12.799 --> 00:11:21.120
And so I was talking to a number of my colleagues at the law school, and a couple really impressive people, Anthony Niblett and Albert Yoon.

00:11:21.279 --> 00:11:24.480
Anthony Niblett has business and law degrees from the University of Melbourne.

00:11:24.559 --> 00:11:34.080
He grew up in Australia, has a PhD in economics from Harvard, and taught at the University of Chicago Law School for a number of years before joining the faculty at the University of Toronto.

00:11:34.240 --> 00:11:45.519
And Albert Yoon, who grew up in upstate New York, went to Yale College and then to Stanford, did his law degree and his PhD at Stanford, and then taught at Northwestern for several years before joining the faculty.

00:11:45.679 --> 00:11:49.120
And you know, we'd frequently have lunch together and talk about ideas.

00:11:49.200 --> 00:11:56.399
And I said, guys, this is a strange idea, but I think we should start a tech company to leverage AI to do tax research.

00:11:56.559 --> 00:11:59.679
Both of them were like, okay, Ben, we're on board, but you have to take the lead.

00:11:59.840 --> 00:12:01.440
I'm like, okay, that's fine.

00:12:01.519 --> 00:12:03.840
Uh, you guys are really impressive, brilliant.

00:12:04.000 --> 00:12:12.639
I can't imagine having better co-founders in terms of intellectual capabilities and being really great people to spend a lot of time with.

00:12:12.879 --> 00:12:23.039
So we started Blue J, and it happened to coincide with the end of my term as associate dean, which was great because I was entitled to a year of administrative leave, essentially a sabbatical, after that.

00:12:23.200 --> 00:12:28.559
And so I could just throw myself into this for a year on a risk-free basis, kind of, Pablo.

00:12:28.720 --> 00:12:32.080
So I was like, this is great because I don't have a huge opportunity cost.

00:12:32.159 --> 00:12:37.200
I had this time that I can dedicate to doing whatever it takes to get this off the ground.

00:12:37.440 --> 00:12:44.080
Did you know what you wanted to build, or you just knew it was AI and tax law and you just had to explore the space to figure out what you might build?

00:12:44.320 --> 00:12:48.399
What I wanted to build from the outset is what we've now eventually been able to produce.

00:12:48.639 --> 00:12:49.759
Ten years later, yeah.

00:12:50.000 --> 00:12:51.360
Yeah, and so it took 10 years.

00:12:51.519 --> 00:12:57.919
But what I originally wanted to build out of the gate is a system that could automate all of tax law research.

00:12:58.080 --> 00:13:09.360
So you could just type any arbitrary tax research question into Blue J, and Blue J would just produce, you know, a world-class answer leveraging primary sources of law.

00:13:09.440 --> 00:13:21.519
So like legislation and case law, and maybe commentary, academic commentary, professional commentary on the topic, and give you essentially a full legal memo, a takedown of that particular question.

00:13:21.679 --> 00:13:25.759
That was utterly science fiction in 2015 when we started.

00:13:26.000 --> 00:13:28.960
So we had to start with what we could actually get to work reliably.

00:13:29.039 --> 00:13:38.000
And what it turned out we could get to work reliably was supervised machine learning models that could predict how courts would resolve certain kinds of tax questions.

00:13:38.159 --> 00:13:41.919
And so there are a number of these examples throughout income tax law.

00:13:42.159 --> 00:13:47.919
One example is whether a worker is an independent contractor or an employee for tax purposes.

00:13:48.080 --> 00:13:50.240
That's like a nice conventional example.

00:13:50.320 --> 00:14:02.799
And there have been hundreds and hundreds of these cases decided by the courts over the years, and it's really tricky because you could have an agreement that says you are an independent contractor and there's not an employer-employee relationship here.

00:14:02.960 --> 00:14:22.879
And the courts will ignore that agreement and say, actually, for tax purposes, this worker really is an employee, because they'll look at the functional relationship between the hirer and the worker and say, no, there's too much control exerted over the worker's work by the hirer here for us to ignore that and say that this person really is an independent contractor.

00:14:22.960 --> 00:14:45.759
And so they'll reclassify that relationship, and that has big consequences because then the employer, the deemed employer, should have been withholding income tax at source, should have been withholding EI and CPP contributions, payroll taxes on that worker's earnings, should be paying the CPP and EI matching contributions, the employer matching contributions.

00:14:45.840 --> 00:14:52.240
So there are like really draconian tax consequences that flow from that reclassification of that relationship.

00:14:52.480 --> 00:14:53.200
That's one example.

00:14:53.360 --> 00:15:01.360
So we built a bunch of these different predictive models, which performed extremely well, like better than we thought they could perform.

00:15:01.440 --> 00:15:06.480
And so we're predicting with better than 90% accuracy how courts would resolve these kinds of cases.
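To make the kind of model Ben describes concrete, here is a minimal, purely illustrative sketch: a tiny logistic regression trained to predict employee vs. independent contractor from case features. The features, data, and training setup are invented assumptions for illustration; Blue J's actual models, training data, and feature engineering are not public.

```python
# Toy sketch of a supervised case-outcome classifier (illustrative only).
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(cases, labels, lr=0.5, epochs=2000):
    """Fit a tiny logistic regression with stochastic gradient descent."""
    w = [0.0] * len(cases[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(cases, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Estimated probability the court classifies the worker as an employee."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Hypothetical features: [hirer controls the work, worker owns tools,
# worker bears profit risk]; labels: 1 = employee, 0 = independent contractor.
cases = [[1, 0, 0], [1, 0, 1], [0, 1, 1], [0, 1, 0], [1, 1, 0], [0, 0, 1]]
labels = [1, 1, 0, 0, 1, 0]

w, b = train(cases, labels)
print(predict(w, b, [1, 0, 0]))  # high probability of "employee"
```

The real models were trained case by case on annotated court decisions, which is exactly the labor-intensive, issue-by-issue process discussed later in the interview.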

00:15:06.720 --> 00:15:11.120
And could you do this across countries, or did you have to focus in on specific states?

00:15:11.440 --> 00:15:12.720
We started in Canada.

00:15:12.879 --> 00:15:15.039
So it's Canadian federal income tax law.

00:15:15.120 --> 00:15:17.200
So it's essentially the same across the country.

00:15:17.279 --> 00:15:21.360
There are some complications with Quebec, which uses the civil law rather than the common law.

00:15:21.440 --> 00:15:26.559
But it turns out that those cases are just as predictable as they are in the common law provinces as well.

00:15:26.720 --> 00:15:28.399
And so we were off to the races.

00:15:28.559 --> 00:15:32.399
But we learned, and and I think this is key to the topic of the show.

00:15:32.559 --> 00:15:36.080
What we learned was there was partial product market fit.

00:15:36.159 --> 00:15:55.840
The models that we built were actually functionally very useful when you had one of those kinds of cases that you were analyzing, either as a tax lawyer or as a tax accountant, or even in government, those models were very good because they would give you directionally a really good sense of how to handicap your odds if something were to go into litigation or be challenged by the government.

00:15:56.080 --> 00:15:57.759
Who are the customers, by the way, just to set context?

00:15:57.840 --> 00:16:00.559
Like you're selling this to governments or are you selling this to law firms?

00:16:00.799 --> 00:16:04.080
Yeah, law firms, uh accounting firms, and in government.

00:16:04.240 --> 00:16:05.919
Had some conversations with corporations.

00:16:06.080 --> 00:16:11.840
That was a tougher sell for the tax product because they just really didn't tend to have the same frequency of tax work.

00:16:12.000 --> 00:16:19.039
The biggest limitation of V1 of Blue J is that we only had these issue-by-issue models.

00:16:19.200 --> 00:16:27.519
And so they were really good at what they did, but it didn't satisfy the original vision, which was something that could handle any tax research question that you threw at this thing.

00:16:27.679 --> 00:16:28.879
And so we grew it.

00:16:28.960 --> 00:16:29.840
It was encouraging.

00:16:30.000 --> 00:16:31.360
Like the progress was encouraging.

00:16:31.519 --> 00:16:33.679
This is why Mistral invested.

00:16:33.840 --> 00:16:38.799
It's why we were able to raise a pre-seed round, a seed round, a series A, a series B.

00:16:39.039 --> 00:16:47.120
Walk me through, because this is really important from a position of not having full product market fit, which is you got to a few million in ARR at a good pace.

00:16:47.200 --> 00:16:55.679
Like, as a founder that's never done this before, you would think, okay, if I can get a few million in ARR with no resources, then for sure if I raise more and I've got more stuff going, I'll get to the next phase.

00:16:55.759 --> 00:16:56.799
Like that's gotta be easier.

00:16:56.879 --> 00:16:58.480
And sometimes, like, it's not, right?

00:16:58.559 --> 00:17:00.080
And it kind of happened to you for a while there.

00:17:00.159 --> 00:17:02.720
Like, just describe maybe why that is.

00:17:03.039 --> 00:17:13.839
Technically, what we had built was world class, and we could demonstrate it and people could test it and they would trial it and they'd be like, Yeah, this technology does what Blue J is promising it can do.

00:17:13.920 --> 00:17:18.559
It's actually predicting how these cases are gonna go, and it's fantastic at that.

00:17:18.720 --> 00:17:26.720
What was less appreciated is the behavioral change necessary to adopt that and to actually get the value out of the platform.

00:17:26.880 --> 00:17:34.880
Um, and so one of the big problems with not having a solution that could answer any tax research question is the following would happen.

00:17:34.960 --> 00:17:42.319
And this is a stylized version of what would happen, but people would use it for something that Blue J does and they'd get excited, they're like, oh, this is really great.

00:17:42.480 --> 00:17:46.079
And then that issue would come up again, they'd log in, they'd be like, okay, this is great.

00:17:46.160 --> 00:17:46.880
And they would use it.

00:17:46.960 --> 00:17:53.759
And then they would encounter a slightly different tax issue, and they would come in and we wouldn't have a model for that problem.

00:17:53.920 --> 00:17:55.519
And they'd go, Oh, well, that's too bad.

00:17:55.599 --> 00:17:58.880
I was really hoping Blue J would have something on this, and they don't.

00:17:58.960 --> 00:18:03.359
And then they might try it again and have another disappointing experience.

00:18:03.440 --> 00:18:09.680
And then, like, okay, this thing just doesn't cover enough of what I want it to, and then they forget about it and never log in again.

00:18:09.839 --> 00:18:12.799
And so the usage was really hard to sustain.

00:18:12.960 --> 00:18:16.000
There were some power users who knew exactly what Blue J did.

00:18:16.160 --> 00:18:26.960
They would come in with reasonable frequency and get a ton of value out of the platform, but it wasn't enough of a consistent experience for all the users to really get this thing to take off.

00:18:27.119 --> 00:18:32.400
So, yeah, we got it up to over five million dollars in ARR over several years.

00:18:32.559 --> 00:18:39.680
Um, it was enough to keep us encouraged, but we knew what the problem was: it couldn't answer every tax research question that people had.

00:18:39.839 --> 00:18:47.680
And so it was actually prior to ChatGPT coming out, earlier that fall, like September.

00:18:47.839 --> 00:18:56.000
And I remember the DaVinci 3 version of GPT-3 was released, and I was in the OpenAI playground playing around with it.

00:18:56.079 --> 00:18:57.920
I'm like, oh, this is surprisingly good.

00:18:58.079 --> 00:19:03.920
If you go back into the LinkedIn archives, you can probably find some of the posts that I was making at that time, going, This is really interesting.

00:19:04.000 --> 00:19:06.640
Like, check out what DaVinci 3 is able to do.

00:19:06.799 --> 00:19:23.119
That triggered this thought: okay, maybe there's something with these new large language models that are coming out that we can harness to get to this seemingly magical outcome, this magical user experience, which is: if we can answer any tax research question somebody has, that would be unbelievably good.

00:19:23.279 --> 00:19:35.279
The original vision that we had for the company was: let's build this thing that's tantamount to magic, that could take all of the difficult, challenging work of tax research and really, really automate it.

00:19:35.359 --> 00:19:44.640
And what was it, by the way, technologically, about the old supervised learning that made you have to build model by model, case by case, whereas with LLMs, you know, it could just in theory handle everything?

00:19:44.960 --> 00:19:48.960
The big move was retrieval augmented generation, where you kind of run a search.

00:19:49.039 --> 00:19:53.680
So you have a master corpus of all of the relevant tax research materials.

00:19:53.839 --> 00:20:01.119
You run an intelligent search, and you've got really good vector embeddings, and you're chunking all of the material and indexing it in just the right way.

00:20:01.359 --> 00:20:12.079
You can find the relevant research resources, then generate a very good synthesis of all of the relevant materials and produce an answer.

00:20:12.319 --> 00:20:29.279
But you just could not do that reliably before large language models had sufficient natural language understanding and synthesis capabilities. The token limits, especially early on, even with this new approach, were challenging to work within.

00:20:29.440 --> 00:20:32.720
But the supervised machine learning approach was very laborious.

00:20:32.880 --> 00:20:43.599
We had to go through and do a whole bunch of processing of all of the case law and extract the metadata, annotate it, and train these models one by one and update them every time a new case would come out.

00:20:43.680 --> 00:20:47.119
With the new approach, we don't need to update the models to reflect the new case law.

00:20:47.359 --> 00:20:53.119
The brilliant thing about leveraging retrieval-augmented generation is we can just curate the data.

00:20:53.200 --> 00:21:09.200
And if we're confident about the currency of the tax database and the content in it, and we're confident about our retrieval algorithms and our prompt engineering, the whole user interface and user experience can be dramatically simplified and made really, really slick.
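
The pipeline described here, embed and index a curated corpus, retrieve the most relevant passages for a question, then hand them to an LLM to synthesize a grounded answer, can be sketched in miniature. This is a toy illustration only, not Blue J's implementation: the corpus snippets are invented, a bag-of-words cosine similarity stands in for real vector embeddings, and the "generation" step just returns the prompt a model would see.

```python
import math
import re
from collections import Counter

# Toy stand-in for a curated tax-research library (invented snippets).
CORPUS = [
    "Section 162 allows a deduction for ordinary and necessary business expenses.",
    "Section 163(j) limits the deduction of business interest expense.",
    "Section 1221 defines capital assets for capital gains treatment.",
]

def embed(text):
    # Bag-of-words counts; a real system would use learned vector embeddings.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, corpus, k=2):
    # "Run an intelligent search" over the indexed corpus.
    q = embed(question)
    return sorted(corpus, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

def answer(question, corpus):
    sources = retrieve(question, corpus)
    # A real system sends this prompt to an LLM for synthesis; here we
    # just return the grounded prompt to show what the model would see.
    return "Answer using only these sources:\n" + "\n".join(sources) + "\nQ: " + question

print(retrieve("Is business interest expense deductible?", CORPUS, k=1)[0])
```

The point of the structure is the one made in the conversation: the models stay fixed, and keeping answers current is purely a matter of keeping the corpus current.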

00:21:09.359 --> 00:21:19.039
And that's one of the lessons we learned from the packaging of ChatGPT: a system where you come in and people can just type whatever tax research question they want.

00:21:19.119 --> 00:21:33.759
And we are going to take on the burden of assembling all the right materials, constructing an answer, giving them a full breakdown of the question, and providing all of the underlying authoritative legal sources for that answer.

00:21:34.000 --> 00:21:36.400
That's amazing if we can do that, if we can deliver that.

00:21:36.640 --> 00:21:43.680
Why can't I just use the latest thinking model of ChatGPT and ask it a tax question? Why is your answer going to be so much better than theirs?

00:21:43.759 --> 00:21:45.119
What are you doing that they're not?

00:21:55.599 --> 00:22:09.119
There are a number of things going on there. But, you know, increasingly you can ask the general models tax research questions and you'll often get a pretty good answer, because they're now looking at web sources and able to pull in a bunch of material.

00:22:09.200 --> 00:22:12.319
But there are a few big problems with the general models.

00:22:12.480 --> 00:22:17.920
One is that they're not specifically designed to provide expert-level tax research answers.

00:22:18.079 --> 00:22:26.799
And they're relying on web sources, and web sources are going to be quite variable in the quality of what you find there.

00:22:26.880 --> 00:22:28.559
And a lot of it will be out of date.

00:22:28.640 --> 00:22:40.000
And so if, in 2025, you're relying on guidance or web documents from 2022 and 2023, sometimes everything works out perfectly and the law hasn't really changed.

00:22:40.079 --> 00:22:48.799
And a horizontal approach, like a Perplexity or a ChatGPT or a Gemini, will be able to cobble together an answer, and it'll be good.

00:22:48.880 --> 00:22:49.680
It'll be accurate.

00:22:50.000 --> 00:22:57.440
More commonly, though, some of the materials those systems find, even on the open web, are out of date, anachronistic.

00:22:57.599 --> 00:22:58.640
The law has changed.

00:22:58.799 --> 00:23:07.759
There's been some statutory amendment, or there's some new case law, or, you know, some law firm wrote some guidance and it's been left up there.

00:23:07.839 --> 00:23:08.799
It's quite stale.

00:23:08.960 --> 00:23:16.000
And there have been, you know, new rulings by the Canada Revenue Agency or the IRS, and that's not taken into account.

00:23:16.160 --> 00:23:21.680
And it's going to produce an answer that's missing some really important subsequent developments.

00:23:21.759 --> 00:23:23.599
And these models don't know the difference.

00:23:23.680 --> 00:23:26.400
They'll just produce a confident sounding answer.

00:23:26.640 --> 00:23:31.680
And you're not necessarily able to just click into those sources and see the full text.

00:23:31.759 --> 00:23:42.079
If it's on the open web, you can, but that doesn't solve the underlying problem, which is that you don't have all of the authoritative documents, many of which are behind paywalls, informing the answer.

00:23:42.240 --> 00:23:49.279
And so in tax law, you've got these really terrific broad content collections.

00:23:49.440 --> 00:23:54.480
And traditional publishers have assembled these tax research content collections over decades.

00:23:54.559 --> 00:24:04.400
And so the big publishers are Thomson Reuters, CCH, LexisNexis, and Bloomberg, and they've got these really great comprehensive collections of content.

00:24:04.480 --> 00:24:08.559
And they keep those content collections safely guarded behind paywalls.

00:24:08.640 --> 00:24:10.480
And so it's not on the open web.

00:24:10.559 --> 00:24:15.359
And so those horizontal tools aren't able to access all of that authoritative content.

00:24:15.599 --> 00:24:17.519
So that's on informing the answers.

00:24:17.759 --> 00:24:24.079
We have copies of all the authoritative content necessary to produce really great answers.

00:24:24.240 --> 00:24:32.880
And we make it easy for our users to look at those authoritative sources and validate, okay, this is where this came from, this is where this came from.

00:24:33.039 --> 00:24:41.039
And we're focused exclusively on tax research, which means that the quality is really a cut above what you're going to get from those horizontal solutions.

00:24:41.200 --> 00:24:43.759
And our users are very picky.

00:24:43.920 --> 00:24:51.039
They're discerning users, tax professionals who are not content with using, you know, a ChatGPT.

00:24:51.200 --> 00:24:52.079
I mean, they have to be right.

00:24:52.160 --> 00:24:52.319
Yeah.

00:24:52.400 --> 00:24:54.559
They can't get stuff that's like 60% right.

00:24:54.799 --> 00:24:56.720
They're looking for certainty, yeah.

00:24:56.960 --> 00:24:57.200
Right.

00:24:57.359 --> 00:24:59.039
And time is really valuable to them.

00:24:59.119 --> 00:25:07.680
They do not want to waste time with an inferior product when there's something superior, purpose-built that they can access that's going to give them the best answer.

00:25:07.839 --> 00:25:12.000
Their clients are paying them a lot of money for the time that they're spending doing this research.

00:25:12.160 --> 00:25:19.599
And if they can accelerate that research with tools that cut through that task like a hot knife through butter, they want that hot knife through butter experience.

00:25:19.680 --> 00:25:21.279
And that's why they're turning to Blue J.

00:25:21.440 --> 00:25:26.000
So walking back to that storyline, like you see this just before ChatGPT, kind of this new model.

00:25:26.079 --> 00:25:28.079
You see it could lead to that vision.

00:25:28.240 --> 00:25:28.960
How do you approach it?

00:25:29.039 --> 00:25:30.319
Like, how all in do you go on that?

00:25:30.400 --> 00:25:32.960
Because you have an existing product with existing revenue.

00:25:33.039 --> 00:25:34.000
So how do you balance that?

00:25:34.240 --> 00:25:36.400
Yeah, it took some courage and some conviction.

00:25:36.559 --> 00:25:41.599
The conviction was: doing what we're doing is not going to scale properly.

00:25:41.839 --> 00:25:47.519
We're just not going to get that breakout product market fit that we need in order for all of this to really work out.

00:25:47.759 --> 00:25:50.000
Partly it was necessity.

00:25:50.079 --> 00:25:56.559
Like if we're going to make this work, we have to get to some solution that can answer any tax research question people want to ask.

00:25:56.720 --> 00:26:00.240
The other one was, well, do we just forego all of this revenue?

00:26:00.319 --> 00:26:01.759
Because your point is a good one.

00:26:01.839 --> 00:26:06.000
Like, we couldn't just abandon our existing customers who were paying us good money for access.

00:26:06.240 --> 00:26:09.440
And the models worked, like they were providing value to the customers.

00:26:09.519 --> 00:26:11.359
But here's kind of how we navigated it.

00:26:11.440 --> 00:26:18.960
It was, okay, we're going to put all of our existing tax research tools into kind of maintenance mode and we'll keep servicing the software.

00:26:19.039 --> 00:26:22.480
We'll keep updating it, but we're not going to invest in new feature development.

00:26:22.640 --> 00:26:24.960
We're not going to invest in building new models.

00:26:25.119 --> 00:26:26.640
People are going to get what's in there.

00:26:26.720 --> 00:26:28.000
And we're going to take six months.

00:26:28.160 --> 00:26:31.519
I told the company, we're going to take the first six months of 2023.

00:26:31.680 --> 00:26:39.440
We're going to focus all of our new development efforts in building this thing that can answer any tax research question in US federal income tax law.

00:26:39.599 --> 00:26:41.119
Why US federal income tax law?

00:26:41.279 --> 00:26:42.720
Well, the market's the biggest on earth.

00:26:42.799 --> 00:26:43.759
There are a lot of materials.

00:26:44.079 --> 00:26:53.039
We had the library necessary to build that because of our existing relationship with Tax Notes, which is a nonprofit publisher in the US.

00:26:53.200 --> 00:26:58.799
And so we had access to all of the necessary content, and it was going to be a really great market opportunity.

00:26:59.039 --> 00:27:01.200
And at this point, you had seen like ChatGPT had come out.

00:27:01.279 --> 00:27:06.240
And so you saw the consumer adoption of that product and just how incredible it could be.

00:27:06.559 --> 00:27:07.359
Totally, totally.

00:27:07.519 --> 00:27:17.359
And I'm like, okay, there has to be this kind of version of ChatGPT, but specifically for tax research. We were in the right place at the right time with the right opportunity.

00:27:17.519 --> 00:27:19.359
We had a team of about 50 people.

00:27:19.519 --> 00:27:22.480
We had a decent balance sheet at the time.

00:27:22.720 --> 00:27:24.480
And it was, it was go time.

00:27:24.559 --> 00:27:27.759
We had the data science team, we had the data all prepped.

00:27:27.839 --> 00:27:28.799
We were ready to go.

00:27:28.880 --> 00:27:34.240
And so we made that brave decision to make this pivot and say, we're going to use this new technology.

00:27:34.319 --> 00:27:42.720
We're going to put all of our chips on large language models and see if we can get them to do what we set out to do originally back in 2015 when we started Blue J.

00:27:42.799 --> 00:27:51.279
And Pablo, I have to say, like by the end of June, we had something that could answer, you know, US federal income tax questions, but it was a little bit gnarly.

00:27:51.359 --> 00:27:52.319
It was a little bit janky.

00:27:52.480 --> 00:27:56.480
So like half the time there would be some issue with the answer.

00:27:56.559 --> 00:27:57.359
It wasn't quite right.

00:27:57.519 --> 00:27:59.440
Maybe it hallucinated something.

00:27:59.599 --> 00:28:01.920
It took like 90 seconds to produce a response.

00:28:02.000 --> 00:28:04.319
So it would go through the retrieval and then the generation.

00:28:04.480 --> 00:28:06.480
And it was just a single shot interaction.

00:28:06.640 --> 00:28:16.079
So if you, as a user, typed in your tax research question and you waited 90 seconds and you got the answer and you weren't quite happy with it, you couldn't challenge it with a subsequent question.

00:28:16.160 --> 00:28:17.279
It was like you had to start over.

00:28:17.359 --> 00:28:21.039
And so you like cut and paste the prompt and like change it a little bit and then try again.

00:28:21.200 --> 00:28:22.000
It was okay.

00:28:22.240 --> 00:28:26.319
We had about a plus 20 NPS score, which is like merely okay.

00:28:26.480 --> 00:28:31.759
Like, it's the bare threshold you kind of have to get to, I think, to have a saleable SaaS product.
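
Since NPS carries the narrative here (+20 at launch, around 70 a year later, mid-80s now), it's worth pinning down the arithmetic behind the metric: the percentage of promoters (scores 9 to 10) minus the percentage of detractors (0 to 6), with passives (7 to 8) counted only in the denominator. The survey responses below are invented purely for illustration.

```python
def nps(scores):
    # Net Promoter Score: % promoters (9-10) minus % detractors (0-6).
    # Passives (7-8) count toward the total but toward neither bucket.
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical responses to "How likely are you to recommend us?" (0-10):
# 5 promoters, 2 passives, 3 detractors out of 10 -> (50% - 30%) = +20.
print(nps([10, 9, 9, 8, 7, 6, 3, 10, 9, 2]))
```

A +20 like the example above means promoters only modestly outnumber detractors, which is why it reads as "merely okay"; the mid-80s scores mentioned later require nearly every respondent to be a promoter.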

00:28:31.920 --> 00:28:36.799
But we had a long list of things we wanted to do to try to improve the performance of the system.

00:28:36.960 --> 00:28:42.000
So we ended 2023 at just over $2 million in ARR.

00:28:42.079 --> 00:28:43.440
So it was a very successful launch.

00:28:43.599 --> 00:28:45.920
On the new product, you're saying? Just on the new product?

00:28:46.000 --> 00:28:50.480
So it's like we got to a couple million pretty fast with a pretty janky product.

00:28:50.720 --> 00:28:54.799
2024, lots more innovation, lots more iteration.

00:28:54.960 --> 00:29:00.079
There's, of course, improvement in the underlying foundational models as well over this time, right?

00:29:00.160 --> 00:29:10.240
So we were using GPT-3.5 out of the gate, and then we brought GPT-4 on board, and that started to lead to improved performance, along with subsequent model developments like GPT-4o.

00:29:10.559 --> 00:29:13.519
We are improving retrieval, we're improving our prompting.

00:29:13.599 --> 00:29:15.440
We made it fully conversational.

00:29:15.519 --> 00:29:20.000
And so by the end of 2024, the NPS was around 70.

00:29:20.240 --> 00:29:21.759
It was fully conversational.

00:29:22.000 --> 00:29:23.920
You didn't have to wait nearly as long for an answer.

00:29:24.000 --> 00:29:26.799
It was like 15 seconds rather than 90 seconds.

00:29:26.880 --> 00:29:33.599
And we grew from roughly $2 million in ARR to just shy of $9 million in ARR by the end of 2024.

00:29:33.759 --> 00:29:35.599
It's like, okay, now this is really working.

00:29:35.759 --> 00:29:36.880
This is really getting exciting.

00:29:37.039 --> 00:29:41.599
We were cash flow positive from operations in 2024, which is pretty cool as a startup.

00:29:41.759 --> 00:29:44.880
And so we headed into 2025, very optimistic.

00:29:45.039 --> 00:29:49.759
Fast forward to now: we're three quarters of the way through 2025.

00:29:49.920 --> 00:29:52.880
The NPS on our product is now in the mid-80s.

00:29:53.119 --> 00:29:56.720
Over the trailing 30 days, it was 84, and people love it.

00:29:56.799 --> 00:30:04.799
We've gone from fewer than 100 firms subscribing at the end of 2023 to like 1,200-ish at the end of 2024.

00:30:04.880 --> 00:30:09.440
Now we're closing in on 3,400 firms who are signed up to Blue J.

00:30:09.519 --> 00:30:14.559
And we're adding about another 10 firms, new logos, every single day.

00:30:14.720 --> 00:30:15.599
And it's wild.

00:30:15.759 --> 00:30:17.839
So now we're in this hypergrowth stage.

00:30:18.000 --> 00:30:19.920
What was your growth in this last kind of year?

00:30:20.160 --> 00:30:23.440
Well, yeah, now we're in the mid-20s millions of ARR.

00:30:23.680 --> 00:30:23.920
Crazy.

00:30:24.079 --> 00:30:28.559
So we've already basically tripled the business from where we ended at the end of 2024.

00:30:28.640 --> 00:30:36.720
And we still have Q4, which is traditionally the strongest quarter for this business, ahead of us, plus a really healthy pipeline.

00:30:36.960 --> 00:30:37.440
That's huge.

00:30:37.599 --> 00:30:40.720
One question, by the way, because I remember this from the older days of Blue J.

00:30:40.799 --> 00:30:48.640
And generally in legal tech, a lot of these lawyers, the way that they charge is based on, you know, how much time they spend doing something, like billable hours.

00:30:48.720 --> 00:30:51.039
Like, what's the effect of Blue J on that?

00:30:51.119 --> 00:30:52.640
Like, does it lower their billable hours?

00:30:52.720 --> 00:30:57.599
Is it just that the markets change so much they have no choice, or is the impact not that kind of one-to-one?

00:30:57.920 --> 00:31:03.680
I think what's driving it is it really is at least 10 times better than doing tax research the traditional way.

00:31:04.000 --> 00:31:10.559
So the fact that this method exists, it's so readily available, and the time to value is like instant.

00:31:10.640 --> 00:31:15.039
Like when people do a trial of this thing, they get in, they try a tax research question.

00:31:15.200 --> 00:31:22.720
And especially if they have just finished, you know, a different tax research task in one of the traditional platforms that they're using, they know what the answer is.

00:31:22.799 --> 00:31:28.480
So when Blue J produces the answer in 20 seconds and they can read it, they go, I know that's the right answer.

00:31:28.640 --> 00:31:34.880
And I know how long it just took me to do this and this other thing, which could be six or eight hours, and this thing did it in 20 seconds.

00:31:35.039 --> 00:31:39.599
They're like, okay, I can't justify it to myself to just burn my life in that way.

00:31:39.759 --> 00:31:42.160
Competing firms are adopting this thing.

00:31:42.400 --> 00:31:44.400
Like, if this thing exists, we kind of have to have it.

00:31:44.559 --> 00:31:46.000
By the way, you mentioned one thing there.

00:31:46.079 --> 00:31:47.279
You mentioned time to value.

00:31:47.359 --> 00:31:48.960
I was talking to another founder recently.

00:31:49.119 --> 00:31:51.920
We were talking about growth and specifically word of mouth growth.

00:31:52.160 --> 00:31:59.039
His take, which I've been thinking about a lot since then, is that the number one biggest thing you can do for word of mouth is time to value.

00:31:59.119 --> 00:32:10.079
If you can put something in somebody's hands and they can get value right away, the likelihood that they're going to tell other people about it is way higher than if the same value comes but takes hours or days to arrive.

00:32:10.240 --> 00:32:11.599
Is that true in your experience?

00:32:11.839 --> 00:32:12.400
Absolutely.

00:32:12.559 --> 00:32:25.680
I mean, the conversion, the trial conversion, the word of mouth... it's so much fun to sell Blue J now compared to V1, where the value was more difficult to unlock and less consistent.

00:32:25.920 --> 00:32:28.240
Just think about the social dynamics there, right, Pablo?

00:32:28.319 --> 00:32:45.920
Because I think you put your finger on something, which is: if I'm confident that when I make this recommendation to you, you can, at very low cost, go and validate it for yourself right away, it's socially far less risky for me to suggest it to you.

00:32:46.000 --> 00:32:52.559
I'm not inviting you to go spend a few days trying to figure something out and then you might resent me for having suggested it.

00:32:52.640 --> 00:32:53.440
No, go try it.

00:32:53.519 --> 00:32:54.160
Just go try it.

00:32:54.240 --> 00:32:55.119
Like it's easy.

00:32:55.279 --> 00:32:56.559
If it works, that's awesome.

00:32:56.640 --> 00:32:59.599
Because then you're gonna be like pleased that I shared this tip with you.

00:32:59.759 --> 00:33:05.440
And if it doesn't work well, it wasn't a huge time commitment on your part to figure out that it wasn't for you anyway.

00:33:05.519 --> 00:33:07.200
But yeah, that's a great way of putting it.

00:33:07.359 --> 00:33:16.880
The consistency of the value: I'm not asking you to make a huge investment of your time, and the upside versus the investment to validate it is so asymmetric.

00:33:17.119 --> 00:33:28.640
If you think about the opposite, like if I were to recommend Photoshop to somebody, then in order to protect that social dynamic, I'd actually have to warn them: it's a great product, but it's going to take a long time to get into.

00:33:29.039 --> 00:33:38.079
All of that is going to make the word of mouth so much harder and less likely to happen than if, as you said, you know they're going to try it and love it right away.

00:33:38.160 --> 00:33:39.279
It's just gonna make you look good.

00:33:39.440 --> 00:33:40.240
Yeah, 100%.

00:33:40.799 --> 00:33:42.720
There are three questions we typically end on.

00:33:42.799 --> 00:33:47.119
One of them is for you personally, like, when did you feel like you had true product market fit?

00:33:47.359 --> 00:33:52.400
I was doing a demonstration for a group at the Canada Revenue Agency.

00:33:52.480 --> 00:33:54.160
I think it was early 2024.

00:33:54.319 --> 00:34:05.039
And we had just built the Canadian product, and I was invited by somebody at the CRA to come into their Toronto Tax Services office and do a bit of a talk on AI and tax.

00:34:05.200 --> 00:34:14.639
And I did a demo, you know, ran through a couple of example questions, but then I turned it over to the audience and put them on the spot: does anyone have any questions?

00:34:14.800 --> 00:34:16.719
Any tax research questions that they have?

00:34:16.880 --> 00:34:19.679
And a guy put up his hand and he said, I have one for you.

00:34:19.760 --> 00:34:22.320
He was sitting right at the front at one of these round tables.

00:34:22.480 --> 00:34:25.840
And it was probably a group of like 150-ish people at the CRA.

00:34:26.000 --> 00:34:27.119
So a pretty big group.

00:34:27.360 --> 00:34:29.760
And he said, This is a tricky problem.

00:34:29.920 --> 00:34:33.440
It took us two weeks to do this internally.

00:34:33.679 --> 00:34:36.079
I don't have my hopes up, but can you try this?

00:34:36.239 --> 00:34:41.199
And he described the question, and I pushed enter, and I'm like, okay, let's see what Blue J comes up with.

00:34:41.360 --> 00:34:44.480
And Blue J started bringing in the answer.

00:34:44.800 --> 00:34:49.840
This guy stood up and he like walked up to the screen and he was like reading it line by line.

00:34:50.000 --> 00:34:51.440
That's the answer we came up with.

00:34:51.519 --> 00:34:55.039
Okay, it was a two-week research task, and we had a bunch of experts on it.

00:34:55.280 --> 00:34:59.440
And I remember driving back to my office at the law school afterwards.

00:34:59.599 --> 00:35:05.280
I turned up the radio and I was just pretty pumped, because I was like, okay, that was hugely successful.

00:35:05.440 --> 00:35:09.679
That was an ecologically valid test of Blue J.

00:35:09.760 --> 00:35:10.960
And we just hit it out of the park.

00:35:11.039 --> 00:35:13.599
It was unbelievably successful, and it had all the sources.

00:35:13.679 --> 00:35:15.199
And he's like, Those are the right sources.

00:35:15.360 --> 00:35:17.280
And he couldn't believe that it did it.

00:35:17.360 --> 00:35:21.119
And in those demos, the audience always has like some reservations.

00:35:21.280 --> 00:35:36.159
So they're like, oh, maybe this is a cooked-up example, maybe this is being presented to really overpromise what this product can actually do. Which is why I think it's always so powerful to say, well, do you have any research problems you were working on recently?

00:35:36.320 --> 00:35:39.840
I have no idea what you might have been working on.

00:35:40.000 --> 00:35:46.559
So to demonstrate it live to a very discerning group, that was really hugely validating.

00:35:46.880 --> 00:35:52.400
You know, that moment when you're in the car and you're just vibing to the music and you feel like, hey, you really have it.

00:35:52.480 --> 00:35:57.440
I'm talking to a lot of founders that frankly get a little lucky because they have that moment like a year after launching, right?

00:35:57.519 --> 00:35:59.280
Like there's just so much stuff happening in your case.

00:35:59.440 --> 00:36:01.840
It took like eight years, and you actually mentioned this earlier.

00:36:02.000 --> 00:36:12.960
Like, when the ChatGPT stuff happened in early 2023, you were in the right time, the right place, you had all the ingredients, and finally there was this inflection point in the market that you could take advantage of.

00:36:13.119 --> 00:36:22.480
But in those seven or eight years, whatever it was, was there a time when you questioned the whole operation, when you thought maybe it's just not gonna work?

00:36:22.800 --> 00:36:23.599
It's a good question.

00:36:23.760 --> 00:36:27.440
I would tell Code this all the time, like every time I talked to Code.

00:36:27.679 --> 00:36:29.679
Code, who, by the way, is the other partner at my firm.

00:36:29.920 --> 00:36:32.639
Yeah, I'm not talking to computer code, I'm talking to Code the human.

00:36:32.800 --> 00:36:36.480
But I'd say, Code, I'm gonna make this work or I'm going to die trying.

00:36:36.639 --> 00:36:41.599
Because I had such conviction in the fact that AI could solve this problem.

00:36:41.760 --> 00:36:49.119
It was just a matter of time. And there was that attitude of being all in: I'm gonna do it, or I'll die trying.

00:36:49.199 --> 00:36:58.960
That was hugely, I think, in some ways, the right attitude to set us up for seizing the moment, seizing the opportunity when it came.

00:36:59.119 --> 00:37:01.039
It's also a little bit nuts, right?

00:37:01.280 --> 00:37:06.719
Because, you know, you can also waste your entire life having conviction about some moment that never arrives.

00:37:06.800 --> 00:37:11.360
And so I wouldn't necessarily recommend that strategy, but I couldn't avoid it.

00:37:11.440 --> 00:37:14.320
I just felt so viscerally that this was going to work.

00:37:14.480 --> 00:37:19.840
And so I would not recommend that attitude to others, but I couldn't help but have that attitude.

00:37:20.000 --> 00:37:21.360
And it was just in me.

00:37:21.440 --> 00:37:24.400
And so, of course, you wonder like, when is this gonna come?

00:37:24.559 --> 00:37:25.440
When is this gonna come?

00:37:25.599 --> 00:37:26.960
I'm certain it is gonna come.

00:37:27.119 --> 00:37:32.159
And when it does come, I sure as heck want it to be me who seizes the opportunity.

00:37:32.320 --> 00:37:37.360
And then, Pablo, in this case, I kind of feel extraordinarily fortunate.

00:37:37.519 --> 00:37:41.599
It came in time, we didn't run out of money, and we were able to seize the opportunity.

00:37:41.679 --> 00:37:52.239
And now, of course, you know, we're in this hypergrowth phase, which is hugely validating, and it gets me excited about what we're gonna be able to build for global tax professionals over the course of the next several years.

00:37:52.320 --> 00:37:55.519
Um, I think it's gonna totally revolutionize global tax.

00:37:55.679 --> 00:37:57.119
And so I'm super excited.

00:37:57.280 --> 00:38:04.400
But we were really, really early, and it was a long eight-year stretch between starting out and actually getting there.

00:38:04.639 --> 00:38:10.719
Looking back, it's always like, yeah, eight years and then things happened. But when you're living it, eight years can feel like a very long time.

00:38:10.960 --> 00:38:19.119
What would be kind of your top piece of advice for an early stage founder that is in that kind of finding product market fit phase of the business?

00:38:19.440 --> 00:38:21.440
Every context is so different.

00:38:21.679 --> 00:38:28.079
Maybe the biggest thing is be ruthlessly honest with yourself about whether you have product market fit.

00:38:28.159 --> 00:38:34.960
Having kind of muddled through for so many years, I think it was always like: we had customers.

00:38:35.119 --> 00:38:38.639
We had like many hundreds of firms signed up to the version one of the platform.

00:38:38.960 --> 00:38:44.159
So it's hard to say that we didn't have product market fit, but we didn't have like really good product market fit.

00:38:44.400 --> 00:38:50.639
It wasn't like this thing where it's like, okay, this is really, really working, this is really, really solid.

00:38:50.800 --> 00:38:54.480
It always kind of felt like we had a partial vacuum, a partial seal.

00:38:54.559 --> 00:39:00.800
It wasn't really tight and working in the way that, you know, you would really want it to work.

00:39:00.960 --> 00:39:20.960
And it was that openness to realizing we weren't quite there. We were close, we had a good vision, we were attacking a worthwhile problem, which is why we got as far as we did. But there's a difference when you really hit on a very strong solution, and you have to keep going for it until you get it.

00:39:21.039 --> 00:39:23.280
And then even then, you can't be complacent.

00:39:23.360 --> 00:39:29.039
Like, even now, with our NPS in the mid-80s, it's like, okay, well, let's keep going.

00:39:29.119 --> 00:39:30.159
How do we make it even better?

00:39:30.320 --> 00:39:39.199
Like, what are the next landmarks we have to reach in terms of the product in order to really, really solve tax as a domain?

00:39:39.440 --> 00:39:40.960
And so we're still going.

00:39:41.119 --> 00:39:47.360
But it's being really honest about it, because as a founder, you're probably the easiest one to deceive.

00:39:47.519 --> 00:39:56.639
You can deceive yourself about this, you can have happy ears and listen to the folks who are trying to provide encouragement, telling you, yeah, this is really great.

00:39:56.719 --> 00:39:58.480
I could totally see myself using this.

00:39:58.719 --> 00:40:13.199
The real test is: are they using it aggressively, like every day, are they thrilled, are they telling everybody else they know about this thing? If that's happening and they're paying real money for it, that's when you know you've got product market fit.

00:40:13.440 --> 00:40:13.679
Perfect.

00:40:13.760 --> 00:40:15.199
Well, Ben, we'll uh we'll stop it there.

00:40:15.280 --> 00:40:16.559
Thanks so much for jumping on the show, man.

00:40:16.639 --> 00:40:17.119
It's been awesome.

00:40:17.360 --> 00:40:17.760
My pleasure.

00:40:17.840 --> 00:40:18.800
Thanks for having me.

00:40:19.119 --> 00:40:20.880
Wow, what an episode.

00:40:20.960 --> 00:40:23.199
You're probably in awe, you're in absolute shock.

00:40:23.360 --> 00:40:26.239
You're like, that helped me so much.

00:40:26.400 --> 00:40:27.039
So guess what?

00:40:27.119 --> 00:40:29.679
Now it's your turn to help someone else.

00:40:29.920 --> 00:40:37.360
Share the episode in the WhatsApp group you have with founders, share it on that Slack channel, send it to your founder friends and help them out.

00:40:37.599 --> 00:40:39.679
Trust me, they will love you for it.