WEBVTT
00:00:00.239 --> 00:00:03.600
What we learned was there was partial product market fit.
00:00:03.759 --> 00:00:16.960
The models that we built were actually functionally very useful when you had one of those kinds of cases because they would give you directionally a really good sense of how to handicap your odds if something were to go into litigation or be challenged by the government.
00:00:17.199 --> 00:00:19.600
And he said, This is a tricky problem.
00:00:19.760 --> 00:00:22.559
It took us two weeks to do this internally.
00:00:22.719 --> 00:00:23.440
Can you try this?
00:00:23.519 --> 00:00:28.559
And he described the question and I pushed enter, and I'm like, okay, let's see what Blue J comes up with.
00:00:28.640 --> 00:00:31.440
And Blue J started bringing in the answer.
00:00:31.679 --> 00:00:36.240
This guy stood up, walked up to the screen, and he was like reading it line by line.
00:00:36.399 --> 00:00:38.880
And he said, That's the answer we came up with.
00:00:39.039 --> 00:00:40.560
It was fully conversational.
00:00:40.799 --> 00:00:42.719
You didn't have to wait nearly as long for an answer.
00:00:42.799 --> 00:00:45.520
It was like 15 seconds rather than 90 seconds.
00:00:45.679 --> 00:00:52.240
And we grew from roughly $2 million in ARR to just shy of $9 million in ARR by the end of 2024.
00:00:52.399 --> 00:00:54.719
It's like, okay, now this is really working.
00:00:56.719 --> 00:00:58.320
That's product market fit.
00:00:58.479 --> 00:00:59.439
Product market fit.
00:00:59.600 --> 00:01:00.399
Product market fit.
00:01:00.560 --> 00:01:02.159
I called it the product market fit question.
00:01:02.320 --> 00:01:03.039
Product market fit.
00:01:03.200 --> 00:01:03.920
Product market fit.
00:01:04.239 --> 00:01:05.200
Product market fit.
00:01:05.359 --> 00:01:06.319
Product market fit.
00:01:06.560 --> 00:01:08.719
I mean, the name of the show is product market fit.
00:01:08.959 --> 00:01:11.439
Do you think the product market fit show has product market fit?
00:01:11.599 --> 00:01:13.920
Because if you do, then there's something you just have to do.
00:01:14.000 --> 00:01:15.359
You have to take up your phone.
00:01:15.519 --> 00:01:17.359
You have to leave the show five stars.
00:01:17.519 --> 00:01:20.879
It lets us reach more founders and it lets us get better guests.
00:01:21.040 --> 00:01:21.840
Thank you.
00:01:22.959 --> 00:01:24.159
Ben, welcome to the show, man.
00:01:24.319 --> 00:01:24.879
Thanks, Pablo.
00:01:25.280 --> 00:01:28.640
Most people I interview on the show, I just meet right before I interview them.
00:01:28.799 --> 00:01:31.439
We've been investors in your company for a while now.
00:01:31.519 --> 00:01:38.640
I think we invested in like 2017, 2018, and you've been working on this since 2015, but you hit a major, major inflection point.
00:01:38.879 --> 00:01:40.640
When would you say that was? Like a year and a half ago?
00:01:40.719 --> 00:01:41.519
Two years, is that right?
00:01:41.680 --> 00:01:43.120
Yeah, I would say about two years ago.
00:01:43.280 --> 00:01:45.120
And you just raised what was the latest round?
00:01:45.359 --> 00:01:53.280
Uh we raised a Series D round, and that closed in July of 2025, and it was 122 million USD.
00:01:53.519 --> 00:01:55.120
So let's start at the beginning, right?
00:01:55.200 --> 00:01:57.120
You started this in 2015.
00:01:57.280 --> 00:01:58.319
What were you doing at that time?
00:01:58.400 --> 00:02:00.640
And what did you see that made you start Blue J in the first place?
00:02:00.879 --> 00:02:03.760
You know, I started out as a tax law professor.
00:02:03.840 --> 00:02:06.959
I'm still a tax law professor at the University of Toronto.
00:02:07.040 --> 00:02:22.000
But I had gone to undergrad and studied economics, finance, philosophy, went into law school at the University of Toronto, did a graduate degree in economics concurrently with my law degree, and then went on to graduate studies in law at Yale Law School.
00:02:22.080 --> 00:02:27.120
And then I moved to Ottawa and I was a law clerk at the Supreme Court for a year.
00:02:27.280 --> 00:02:33.039
And immediately after that, I became a tenure-track law professor at the University of Toronto at age 26.
00:02:33.120 --> 00:02:34.240
I think maybe the youngest.
00:02:34.479 --> 00:02:35.039
How does that happen?
00:02:35.120 --> 00:02:36.000
Because that's not... yeah, that's not typical.
00:02:36.400 --> 00:02:40.000
No, maybe the youngest tenure-track law prof in the history of the law school.
00:02:40.240 --> 00:02:44.400
I think uh a whole bunch of things went really my way on that one.
00:02:44.560 --> 00:02:47.280
I had an interesting research agenda.
00:02:47.520 --> 00:02:53.759
I had published some articles as a law student, and I was super keen on the academic path.
00:02:53.919 --> 00:03:00.319
I was really excited about the life of the mind and doing research and going deep on tax law.
00:03:00.400 --> 00:03:04.240
And so that's what I spent the ensuing decade really focused on.
00:03:04.400 --> 00:03:14.319
So from 2004 right through to when I started Blue J, I was full tilt into becoming the best possible tax law academic that I could be.
00:03:14.400 --> 00:03:17.039
And so I wrote dozens of academic articles.
00:03:17.199 --> 00:03:23.919
I co-authored several editions of the leading tax book used to teach tax in Canadian law schools.
00:03:24.000 --> 00:03:26.319
Um, the book's called Canadian Income Tax Law.
00:03:26.479 --> 00:03:32.080
And I co-authored with the lead author David Duff, the second, third, fourth, fifth, and sixth editions of that book.
00:03:32.319 --> 00:03:32.800
Did you like it?
00:03:32.879 --> 00:03:34.879
Like, was it everything you kind of expected it to be?
00:03:34.960 --> 00:03:36.080
The professor life?
00:03:36.400 --> 00:03:38.560
The professor life is an amazing life.
00:03:38.719 --> 00:03:41.759
You get to research things that you're interested in.
00:03:41.919 --> 00:03:47.520
The students are evergreen and very, very talented and very ambitious.
00:03:47.759 --> 00:03:52.879
And the classroom discussion is always interesting and engaging and challenging.
00:03:53.120 --> 00:03:58.879
And if you approach law teaching with the right attitude, you learn as much from your students as they learn from you.
00:03:59.039 --> 00:04:03.120
And so, yes, being a law professor is an amazing privilege.
00:04:03.280 --> 00:04:04.319
I really loved it.
00:04:04.400 --> 00:04:05.439
And I still do love it.
00:04:05.599 --> 00:04:09.840
I may return to it full-time in my post-Blue J life.
00:04:10.000 --> 00:04:21.439
But it was in the context of becoming associate dean of the law school and leading a reform of the curriculum that I started thinking about AI in the first place.
00:04:21.600 --> 00:04:23.040
And so I petitioned the dean.
00:04:23.120 --> 00:04:37.199
I said, if I'm going to become associate dean, one of the things I'd like to do is lead a reform of the JD curriculum, because I had just been through the law school curriculum from 1999 to 2002 when I was a law student.
00:04:37.360 --> 00:04:41.680
And I felt like it could be done in a different and probably a more effective way.
00:04:41.920 --> 00:04:42.879
Was it just like outdated?
00:04:42.959 --> 00:04:44.720
Like, I mean, a lot of education I think is outdated.
00:04:44.800 --> 00:04:52.959
Was that kind of just the general...? One of the characteristic things about the curriculum at the time was that it was evaluated virtually entirely through 100% final examinations.
00:04:53.040 --> 00:05:00.639
So you would start the first year of law school in September, and you would have virtually no evaluations that counted for anything until April.
00:05:00.720 --> 00:05:06.959
And then you'd write a series of six or seven final exams, each of which would be worth 100% of the final grade.
00:05:07.040 --> 00:05:07.279
Wow.
00:05:07.439 --> 00:05:25.680
And so you wouldn't really know how well you were doing until you got your grades sometime in the summer following your first year of law school, which seemed like not the best way to allow people to learn, iterate on their learning approaches, and really figure out how well they were learning the law.
00:05:25.920 --> 00:05:31.439
And so it was as a result of co-leading this reform that I started thinking about AI.
00:05:31.519 --> 00:05:33.360
And it happened pretty innocently.
00:05:33.439 --> 00:05:39.839
I asked one of my colleagues what they thought about the curriculum reform, and they said, Ben, you really shouldn't do this.
00:05:40.079 --> 00:05:41.360
I said, Oh, why?
00:05:41.519 --> 00:05:57.680
They said, Well, the last time we had a major reform of the law school curriculum was in the early 1970s, and it led to some hard feelings among some faculty, who never really got along after that effort as well as they did before it.
00:05:57.839 --> 00:06:02.560
They said people just feel strongly about the curriculum and how they approach it.
00:06:02.639 --> 00:06:07.279
And so it's a bit of a minefield, and you'd be better off just focusing on your scholarship.
00:06:07.439 --> 00:06:14.720
And I didn't take from that the lesson that I think the colleague, who was hoping to be helpful, wanted to deliver, which was to just kind of back off.
00:06:14.800 --> 00:06:16.560
Instead, I thought, oh, that's really interesting.
00:06:16.800 --> 00:06:20.160
It's been 40 years since somebody tried to do this.
00:06:20.319 --> 00:06:23.920
And I thought, okay, for one thing, it's probably far overdue.
00:06:24.000 --> 00:06:29.519
We should do it again and move to change the curriculum, how we're approaching teaching the law.
00:06:29.680 --> 00:06:43.680
And the second thing I took away from it was this observation, which was, well, if it took 40 years for someone like me to show up and suggest that we should really do a deep dive into changing the curriculum, it may be another 40 years before someone like me shows up and wants to do it again.
00:06:43.759 --> 00:06:48.560
And I had this thought: if that's true, then what does that mean?
00:06:48.720 --> 00:06:57.040
And I immediately started thinking about AI and machine learning, and just trying to go to the future, not the present, but kind of what's coming next.
00:06:57.199 --> 00:07:01.759
Yeah, like let's imagine what the societal backdrop is 40 years from now.
00:07:01.920 --> 00:07:04.959
So at that point, it would have been like thinking about the year 2050.
00:07:05.120 --> 00:07:14.000
What's probably going to be true in the year 2050 that we should be taking into account if we're going to rebuild this curriculum? What should we be planning for to make it robust?
00:07:14.079 --> 00:07:25.519
And I was like, well, Moore's law has been pretty robust: computing power is doubling every couple of years, and its affordability is improving exponentially as well.
00:07:25.600 --> 00:07:27.519
And it's like, okay, so that's coming.
00:07:27.600 --> 00:07:36.240
And then I thought about the fact that legal information, even at that time, was virtually all digital, and algorithms were improving significantly.
00:07:36.319 --> 00:07:43.040
So I was aware of some of the work that Geoff Hinton and other colleagues were doing in the computer science department at the University of Toronto.
00:07:43.120 --> 00:07:49.839
And I kind of felt like, oh man, there is a freight train with AI on the front of it coming directly for the law.
00:07:50.000 --> 00:08:03.040
And I thought, wow, this has big implications for this curriculum reform, but also for the legal system generally, for the judiciary, and, selfishly and closer to home, for me personally.
00:08:03.199 --> 00:08:16.000
And so I started to think about okay, what if I'm standing at the front of a law school classroom in the year 2040, thinking, oh man, I saw this coming in 2012, 2013, 2014.
00:08:16.240 --> 00:08:19.439
And I decided not to change my professional trajectory.
00:08:19.519 --> 00:08:22.079
And I just decided to stay on the same path.
00:08:22.160 --> 00:08:28.240
And I started to think I would probably feel some profound regret about that if I weren't to do anything about it.
00:08:28.319 --> 00:08:31.199
And then I thought, well, what is it that I should really do about it?
00:08:31.360 --> 00:08:37.279
I mean, okay, changing the curriculum is one thing, and we worked on that and introduced the curriculum reform.
00:08:37.360 --> 00:08:39.279
And I think it's been successful.
00:08:39.440 --> 00:08:44.399
You know, we now have more than a decade of experience with the reformed curriculum, and I think it's been successful.
00:08:44.559 --> 00:08:48.080
But I started thinking, okay, so what does this mean for tax law?
00:08:48.240 --> 00:08:52.639
And I'd been through this exercise of revising several editions of this tax book.
00:08:52.799 --> 00:08:59.600
It's like a 1500-page introductory text on Canadian income tax law, and it's such a manual job to update it.
00:08:59.679 --> 00:09:07.440
There are new cases coming down all the time from the Tax Court, from the Federal Court, from the Supreme Court, and the legislation is frequently being amended by Parliament.
00:09:07.600 --> 00:09:09.440
Things are always changing in tax law.
00:09:09.519 --> 00:09:14.480
And so you have to go through and you have to laboriously research each point to see what has happened.
00:09:14.720 --> 00:09:22.320
And it's always true that no sooner is the new edition of the book published than the law has changed and it's already out of date.
00:09:23.120 --> 00:09:25.679
I'm like, there's gotta be a better way to do this.
00:09:25.840 --> 00:09:32.399
I was like, okay, well, so clearly there's a role for AI in improving tax research.
00:09:32.639 --> 00:09:34.480
Okay, so what does that mean for me?
00:09:34.559 --> 00:09:37.200
And I was like, well, somebody's going to improve this.
00:09:37.519 --> 00:09:39.360
I started to think, well, who's that gonna be?
00:09:39.519 --> 00:09:41.519
I started to wonder, well, might it be me?
00:09:41.840 --> 00:09:42.320
Surely not.
00:09:42.480 --> 00:09:45.519
Like, surely it's not a law professor who's gonna be leading this effort.
00:09:45.679 --> 00:09:47.279
Like, surely it's gonna be somebody else.
00:09:47.360 --> 00:09:49.840
But I thought about the different classes of people that it might be.
00:09:49.919 --> 00:10:03.360
It might be a tax law partner in a big firm, but that's so extraordinarily unlikely because they already have clients that they're servicing, they're part of a partnership, they're kind of bought in, they kind of have dedicated their lives to that path.
00:10:03.519 --> 00:10:10.879
It could be somebody who's doing tax at an accounting firm, but it's very similar: it's a professional services role and they're on that path.
00:10:11.039 --> 00:10:15.120
They have clients, and they enjoy a relatively good living.
00:10:15.279 --> 00:10:20.879
And what draws somebody into that kind of career path is it's a known path and it's relatively predictable.
00:10:20.960 --> 00:10:25.679
And dispositionally, they're typically not going to be the kinds of people who are like, oh, I know what I'm gonna do.
00:10:25.759 --> 00:10:32.720
I'm gonna do a tech startup and kind of roll the dice and see if I can make it work, because the opportunity cost of doing that would be so significant.
00:10:32.879 --> 00:10:42.480
And so I thought, well, okay, if it's not gonna be somebody with a tax law background, might it be somebody who's more kind of pure tech, with an AI or machine learning background?
00:10:42.639 --> 00:10:48.639
And again, I thought, well, probably not, because they would need to know a lot more tax law than they do.
00:10:48.799 --> 00:10:53.039
It would take them a decade to catch up to where I was in my understanding of tax law.
00:10:53.120 --> 00:11:00.639
And probably they're just not all that interested in learning tax law, and they wouldn't really understand the pain of tax research, right?
00:11:00.720 --> 00:11:06.720
Like they'd probably just prefer to go work for one of the FANG companies or do a startup in some other area.
00:11:06.879 --> 00:11:09.679
And so it's probably not going to be one of those folks.
00:11:09.840 --> 00:11:12.639
It's like, well, maybe, maybe it is, maybe it could be me.
00:11:12.799 --> 00:11:21.120
So I started talking to a number of my colleagues at the law school, including a couple of really impressive people, Anthony Niblett and Albert Yoon.
00:11:21.279 --> 00:11:24.480
Anthony Niblett has business and law degrees from the University of Melbourne.
00:11:24.559 --> 00:11:34.080
He grew up in Australia, has a PhD in economics from Harvard, and taught at the University of Chicago Law School for a number of years before joining the faculty at the University of Toronto.
00:11:34.240 --> 00:11:45.519
And Albert Yoon, who grew up in upstate New York, went to Yale College and then to Stanford, where he did his law degree and his PhD, and then taught at Northwestern for several years before joining the faculty.
00:11:45.679 --> 00:11:49.120
And you know, we'd frequently have lunch together and talk about ideas.
00:11:49.200 --> 00:11:56.399
And I said, guys, this is a strange idea, but I think we should start a tech company to leverage AI to do tax research.
00:11:56.559 --> 00:11:59.679
Both of them were like, okay, Ben, we're on board, but you have to take the lead.
00:11:59.840 --> 00:12:01.440
I'm like, okay, that's fine.
00:12:01.519 --> 00:12:03.840
Uh, you guys are really impressive, brilliant.
00:12:04.000 --> 00:12:12.639
I can't imagine having better co-founders in terms of intellectual capabilities and being really great people to spend a lot of time with.
00:12:12.879 --> 00:12:23.039
So we started Blue J, and it happened to coincide with the end of my term as associate dean, which was great because I was entitled to a year of administrative leave, essentially a sabbatical, after that.
00:12:23.200 --> 00:12:28.559
And so I could just throw myself into this for a year on a kind of risk-free basis, Pablo.
00:12:28.720 --> 00:12:32.080
So I was like, this is great because I don't have a huge opportunity cost.
00:12:32.159 --> 00:12:37.200
I had this time that I can dedicate to doing whatever it takes to get this off the ground.
00:12:37.440 --> 00:12:44.080
Did you know what you wanted to build, or you just knew it was AI and tax law and you just had to explore the space to figure out what you might build?
00:12:44.320 --> 00:12:48.399
What I wanted to build from the outset is what we've now eventually been able to produce.
00:12:48.639 --> 00:12:49.759
Ten years later, yeah.
00:12:50.000 --> 00:12:51.360
Yeah, and so it took 10 years.
00:12:51.519 --> 00:12:57.919
But what I originally wanted to build out of the gate was a system that could automate all of tax law research.
00:12:58.080 --> 00:13:09.360
So you could just type any arbitrary tax research question into Blue J, and Blue J would just produce, you know, a world-class answer leveraging primary sources of law.
00:13:09.440 --> 00:13:21.519
So legislation and case law, and maybe commentary, academic commentary, professional commentary on the topic, and give you essentially a full legal memo, a takedown of that particular question.
00:13:21.679 --> 00:13:25.759
That was utterly science fiction in 2015 when we started.
00:13:26.000 --> 00:13:28.960
So we had to start with what we could actually get to work reliably.
00:13:29.039 --> 00:13:38.000
And what it turned out we could get to work reliably was supervised machine learning models that could predict how courts would resolve certain kinds of tax questions.
00:13:38.159 --> 00:13:41.919
And so there are a number of these examples throughout income tax law.
00:13:42.159 --> 00:13:47.919
One example is whether a worker is an independent contractor or an employee for tax purposes.
00:13:48.080 --> 00:13:50.240
That's like a nice conventional example.
00:13:50.320 --> 00:14:02.799
And there have been hundreds and hundreds of these cases decided by the courts over the years, and it's really tricky because you could have an agreement that says you are an independent contractor and there's not an employer-employee relationship here.
00:14:02.960 --> 00:14:22.879
And the courts will ignore that agreement and say, actually, for tax purposes, this worker really is an employee, because they'll look at the functional relationship between the hirer and the worker and say, no, there's too much control exerted over the worker's work by the hirer here for us to ignore that and say that this person really is an independent contractor.
00:14:22.960 --> 00:14:45.759
And so they'll reclassify that relationship, and that has big consequences, because then the deemed employer should have been withholding income tax at source, should have been withholding EI and CPP contributions, payroll taxes on that worker's earnings, and should have been paying the employer's matching CPP and EI contributions.
00:14:45.840 --> 00:14:52.240
So there are like really draconian tax consequences that flow from that reclassification of that relationship.
00:14:52.480 --> 00:14:53.200
That's one example.
00:14:53.360 --> 00:15:01.360
So we built a bunch of these different predictive models, which performed extremely well, like better than we thought they could perform.
00:15:01.440 --> 00:15:06.480
And so we were predicting with better than 90% accuracy how courts would resolve these kinds of cases.
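As a rough illustration of the kind of issue-specific supervised model described here, the sketch below trains a generic classifier on hand-coded features of decided worker-classification cases and handicaps a new fact pattern. The feature names, toy data, and scikit-learn model choice are illustrative assumptions, not Blue J's actual pipeline.

```python
# Hypothetical sketch of an issue-specific prediction model (not Blue J's system):
# a supervised classifier over hand-coded features of decided cases.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Each row is one decided case. Features (illustrative assumptions):
# [degree_of_control, supplies_own_tools, chance_of_profit, risk_of_loss, works_exclusively]
X = np.array([
    [0.9, 0, 0.1, 0.0, 1],
    [0.8, 0, 0.2, 0.1, 1],
    [0.7, 1, 0.3, 0.2, 1],
    [0.3, 1, 0.7, 0.6, 0],
    [0.2, 1, 0.8, 0.7, 0],
    [0.1, 0, 0.9, 0.8, 0],
])
y = np.array([1, 1, 1, 0, 0, 0])  # 1 = employee, 0 = independent contractor

model = GradientBoostingClassifier(random_state=0).fit(X, y)

# Handicap a new fact pattern: estimated probability a court finds an employee.
new_case = np.array([[0.6, 0, 0.3, 0.2, 1]])
print("P(employee):", model.predict_proba(new_case)[0, 1])
```

With hundreds of real decided cases as training data, the same pattern is what lets a model give the directional odds described above.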
00:15:06.720 --> 00:15:11.120
And could you do this across countries, or did you have to focus in on specific states?
00:15:11.440 --> 00:15:12.720
We started in Canada.
00:15:12.879 --> 00:15:15.039
So it's Canadian federal income tax law.
00:15:15.120 --> 00:15:17.200
So it's essentially the same across the country.
00:15:17.279 --> 00:15:21.360
There are some complications with Quebec, which uses the civil law rather than the common law.
00:15:21.440 --> 00:15:26.559
But it turns out that those cases are just as predictable as they are in the common law provinces.
00:15:26.720 --> 00:15:28.399
And so we were off to the races.
00:15:28.559 --> 00:15:32.399
But we learned, and I think this is key to the topic of the show.
00:15:32.559 --> 00:15:36.080
What we learned was there was partial product market fit.
00:15:36.159 --> 00:15:55.840
The models that we built were actually functionally very useful when you had one of those kinds of cases to analyze, whether as a tax lawyer, a tax accountant, or even in government, because they would give you directionally a really good sense of how to handicap your odds if something were to go into litigation or be challenged by the government.
00:15:56.080 --> 00:15:57.759
Who are the customers, by the way, just to set context?
00:15:57.840 --> 00:16:00.559
Like you're selling this to governments or are you selling this to law firms?
00:16:00.799 --> 00:16:04.080
Yeah, law firms, accounting firms, and government.
00:16:04.240 --> 00:16:05.919
Had some conversations with corporations.
00:16:06.080 --> 00:16:11.840
That was a tougher sell for the tax product because they just really didn't tend to have the same frequency of tax work.
00:16:12.000 --> 00:16:19.039
The biggest limitation of V1 of Blue J was that we only had these issue-by-issue models.
00:16:19.200 --> 00:16:27.519
And so they were really good at what they did, but it didn't satisfy the original vision, which was something that could handle any tax research question that you threw at this thing.
00:16:27.679 --> 00:16:28.879
And so we grew it.
00:16:28.960 --> 00:16:29.840
It was encouraging.
00:16:30.000 --> 00:16:31.360
Like the progress was encouraging.
00:16:31.519 --> 00:16:33.679
This is why Mistral invested.
00:16:33.840 --> 00:16:38.799
It's why we were able to raise a pre-seed round, a seed round, a series A, a series B.
00:16:39.039 --> 00:16:47.120
Walk me through this, because this is really important: from a position of not having full product market fit, you got to a few million in ARR at a good pace.
00:16:47.200 --> 00:16:55.679
Like as a founder that has never done this before, you would think, okay, if I can get a few million in ARR with no resources, then for sure if I raise more and I've got more stuff going, I'll get to the next phase.
00:16:55.759 --> 00:16:56.799
Like that's gotta be easier.
00:16:56.879 --> 00:16:58.480
And sometimes, like, it's not, right?
00:16:58.559 --> 00:17:00.080
And it kind of happened to you for a while there.
00:17:00.159 --> 00:17:02.720
Like, just describe maybe why that is.
00:17:03.039 --> 00:17:13.839
Technically, what we had built was world class, and we could demonstrate it and people could test it, and they would trial it and be like, yeah, this technology does what Blue J is promising it can do.
00:17:13.920 --> 00:17:18.559
It's actually predicting how these cases are gonna go, and it's fantastic at that.
00:17:18.720 --> 00:17:26.720
What was less appreciated was the behavioral change necessary to adopt it and to actually get the value out of the platform.
00:17:26.880 --> 00:17:34.880
And so one of the big problems with not having a solution that could answer any tax research question is that the following would happen.
00:17:34.960 --> 00:17:42.319
And this is a stylized version of what would happen, but people would use it for something that Blue J does and they'd get excited; they're like, oh, this is really great.
00:17:42.480 --> 00:17:46.079
And then that issue would come up again, they'd log in, they'd be like, okay, this is great.
00:17:46.160 --> 00:17:46.880
And they would use it.
00:17:46.960 --> 00:17:53.759
And then they would encounter a slightly different tax issue, and they would come in and we wouldn't have a model for that problem.
00:17:53.920 --> 00:17:55.519
And they'd go, Oh, well, that's too bad.
00:17:55.599 --> 00:17:58.880
I was really hoping Blue J would have something on this, and they don't.
00:17:58.960 --> 00:18:03.359
And then they might try it again and have another disappointing experience.
00:18:03.440 --> 00:18:09.680
And then it's like, okay, this thing just doesn't cover enough of what I want it to, and then they forget about it and never log in again.
00:18:09.839 --> 00:18:12.799
And so the usage was really hard to sustain.
00:18:12.960 --> 00:18:16.000
There were some power users who knew exactly what Blue J did.
00:18:16.160 --> 00:18:26.960
They would come in with reasonable frequency and get a ton of value out of the platform, but it wasn't enough of a consistent experience for all the users to really get this thing to take off.
00:18:27.119 --> 00:18:32.400
So, yeah, we got it up to over five million dollars in ARR over several years.
00:18:32.559 --> 00:18:39.680
It was enough to keep us encouraged, but we knew what the problem was: it couldn't answer every tax research problem that people had.
00:18:39.839 --> 00:18:47.680
And so it was actually prior to ChatGPT coming out; it was earlier that fall, like September.
00:18:47.839 --> 00:18:56.000
And I remember the DaVinci 3 version of GPT-3 was released, and I was in the OpenAI Playground playing around with it.
00:18:56.079 --> 00:18:57.920
I'm like, oh, this is surprisingly good.
00:18:58.079 --> 00:19:03.920
If you go back into the LinkedIn archives, you can probably find some of the posts that I was making at that time, going, This is really interesting.
00:19:04.000 --> 00:19:06.640
Like, check out what DaVinci 3 is able to do.
00:19:06.799 --> 00:19:23.119
That triggered this thought: okay, maybe there's something with these new large language models that are coming out that we can harness in order to get to this seemingly magical outcome, this magical user experience, which is, if we can answer any tax research question somebody has, that would be unbelievably good.
00:19:23.279 --> 00:19:35.279
It was the original vision that we had for the company: let's build this thing that's tantamount to magic, that could take all of the difficult, challenging work of tax research and really, really automate it.
00:19:35.359 --> 00:19:44.640
And what was it, by the way, technologically? In the old world, supervised learning made you have to build model by model, case by case, but then with LLMs, it could in theory just handle everything.
00:19:44.960 --> 00:19:48.960
The big move was retrieval augmented generation, where you kind of run a search.
00:19:49.039 --> 00:19:53.680
So you have a master corpus of all of the relevant tax research materials.
00:19:53.839 --> 00:20:01.119
You run an intelligent search, and you've got really good vector embeddings, and you're chunking all of the material and indexing it in just the right way.
00:20:01.359 --> 00:20:12.079
You can find the relevant research resources and then you can synthesize them and generate a very good synthesis of all of the relevant materials and produce an answer.
00:20:12.319 --> 00:20:29.279
But you just could not do that reliably prior to the large language models having sufficient natural language understanding and synthesis capabilities. The token limits, especially early on, even with this new approach, were challenging to work within.
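As a loose sketch of the retrieval-augmented generation pattern described here: embed a corpus, retrieve the chunks closest to the question, and have an LLM synthesize an answer grounded in them. The model names, toy corpus, and helper function below are illustrative assumptions, not Blue J's implementation.

```python
# Minimal RAG sketch (illustrative only, not Blue J's system).
import numpy as np
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# 1. A toy research corpus; in practice this would be chunked legislation,
#    case law, and commentary indexed ahead of time.
corpus = [
    "ITA s. 5(1): income from an office or employment includes salary and wages...",
    "Wiebe Door Services v. MNR sets out control, ownership of tools, chance of profit, risk of loss...",
    "671122 Ontario Ltd. v. Sagaz Industries: the central question is whose business it is...",
]

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

corpus_vecs = embed(corpus)

# 2. Retrieve the chunks most similar to the question (cosine similarity).
question = "Is a courier who supplies his own vehicle an employee for tax purposes?"
q_vec = embed([question])[0]
scores = corpus_vecs @ q_vec / (np.linalg.norm(corpus_vecs, axis=1) * np.linalg.norm(q_vec))
top_chunks = [corpus[i] for i in np.argsort(-scores)[:2]]

# 3. Generate an answer synthesized only from the retrieved sources.
prompt = (
    "Answer the tax research question using only these sources:\n"
    + "\n".join(top_chunks)
    + "\n\nQuestion: " + question
)
answer = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
)
print(answer.choices[0].message.content)
```

The retrieval step is what lets the system cover any question the corpus can support, which is the shift away from the one-model-per-issue approach described earlier.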