Benji Encz, CEO and Co-founder of Ashby, discusses the background and development of the all-in-one recruiting platform. He explains that the inspiration for Ashby came from his experience as a director of engineering, where he faced pain points with existing tools in the space, particularly around data and reporting, as well as scheduling coordination. Benji saw an opportunity to build a new platform from scratch that addressed these issues and incorporated modern recruiting operations, data analytics, and automation. While initially targeting early-stage companies, Ashby is now expanding to serve enterprise-level customers. Bob Pulver and Benji discuss the use of automation and AI in the talent acquisition process: the benefits of automation in scheduling interviews and coordinating the hiring process, the importance of setting realistic expectations in product development, and the value of delivering features iteratively. Benji shares his thoughts on the potential of AI in areas such as candidate sourcing and matching, and on the need for individuals to embrace learning and adapt to new technologies in order to elevate their AIQ.
Keywords
recruiting platform, Ashby, software engineering, data and reporting, scheduling coordination, recruiting operations, data analytics, automation, ATS, CRM, sourcing, AI capabilities, decision-making, fairness, quality of hire, interview intelligence, scheduling efficiency, AI, talent acquisition, scheduling, product development, expectations, candidate sourcing, candidate matching, learning, AIQ
Takeaways
- Ashby was developed to address pain points in the recruiting space, particularly around data and reporting, as well as scheduling coordination.
- The platform incorporates modern recruiting operations, data analytics, automation, and features of an ATS, CRM, and sourcing tool.
- AI capabilities are being incorporated thoughtfully, focusing on areas such as outbound personalization, advanced candidate search, and resume review assistance.
- The goal is to improve the efficiency and fairness of the recruiting process, providing a better experience for both recruiters and candidates.
- Automation and AI can greatly improve the efficiency and experience of the talent acquisition process.
- Setting realistic expectations in product development is crucial, and delivering features iteratively can help avoid missed timelines and commitments.
- AI has the potential to revolutionize candidate sourcing and matching, allowing for more targeted and efficient talent acquisition.
- Individuals need to embrace learning and be willing to adapt to new technologies in order to stay competitive in the future of work.
Sound Bites
- "Recruiting operations was becoming a thing. Teams were working more with data. People were buying more tools."
- "We started with outbound personalization, which LLMs are generally really good at."
- "We started with our kind of advanced candidate search, packaging that in natural language."
- "What it took to get there is a lot of logic and steps and data to look at and coalesce."
- "We've stayed in this more natural state of kind of evolving the product more iteratively."
Chapters
00:00 Introduction and Background
08:10 Incorporating Outbound Personalization
14:01 Improving Fairness in the Recruiting Process
23:27 Adoption of Scheduling Technology
28:14 Setting Realistic Expectations in Product Development
35:29 The Potential of AI in Candidate Sourcing and Matching
46:48 Elevating Your AIQ: Embracing Learning and Adapting to New Technologies
Benji Encz: https://www.linkedin.com/in/benjaminencz
Ashby: http://www.ashbyhq.com
For advisory work and podcast sponsorship inquiries:
Bob Pulver: https://linkedin.com/in/bobpulver
Elevate Your AIQ: https://elevateyouraiq.com
Powered by the WRKdefined Podcast Network.
[00:00:09] Hey, it's Bob. Thanks for joining me on another episode of Elevate Your AIQ. Today I'm joined by Benji Encz, CEO and co-founder of Ashby, a modern and comprehensive recruiting platform, which I happened to use myself while still in stealth mode, when it was mostly an applicant tracking system, ATS. Benji and I dive deep into the development of Ashby, discussing how it addresses common pain points in recruiting, particularly around data reporting and scheduling. Benji shares insights on incorporating AI capabilities thoughtfully into the platform, and we explore the potential of AI in
[00:00:39] revolutionizing candidate sourcing and matching. We also touch on the importance of setting realistic expectations in product development and the need for individuals to adapt to new technologies to stay competitive in the future of work. I have no affiliation with Ashby, but I must give kudos to Benji and the Ashby team for doing an amazing job of listening closely to their users and consistently pushing out new features. I see him posting updates on LinkedIn all the time. Hope you enjoy my discussion with Benji, and thanks for tuning in.
[00:01:07] Hello, everyone. Welcome to another episode of Elevate Your AIQ. This is your host, Bob Pulver. With me today is Benji Encz. Benji, welcome.
[00:01:21] Thanks for having me, Bob.
[00:01:22] Thank you for being here. Benji, why don't you just give the audience just a quick overview of your background and how you got to the point where you decided to start Ashby?
[00:01:32] Yeah, I'll try to keep it reasonably short. But my background before starting a business is all in software engineering. And in my most recent role before starting Ashby, I, over time, became a director of engineering.
[00:01:43] I then spent pretty much all of my time hiring engineers and ended up having quite a few pain points with existing tools in the space. And my biggest issues were around data and reporting. That was kind of number one. The second was scheduling coordination, which just took a lot of our team's time.
[00:01:58] But taking kind of a step back, it felt like, you know, recruiting has changed quite a bit. Recruiting operations was becoming a thing. Teams were working more with data, people were buying more tools, and just felt like there's actually a really good opportunity to build kind of a new platform from scratch.
[00:02:11] We had an insight back in 2018 and then spent about two years building kind of the right foundation before starting to launch to some early customers. And then general availability came around the end of 2022. So it's been, you know, almost five years now at this point.
[00:02:27] Part of the insight was building kind of a new version of ATS with kind of all the things built in that we think modern teams would need. And it kind of starts with really strong reporting and analytics at a foundation.
[00:02:40] Good automation for like a rec ops team, you know, making things really customizable. And then in terms of the pillars, the traditional ATS functionality, but also candidate CRM and sourcing, as well as scheduling automation, having all of that in a single product.
[00:02:54] That's kind of what we've built over the last few years.
[00:02:56] Yeah, I remember, I think my first exposure to Ashby, I was trying to drive tech strategy at an RPO firm, and they happen to be one of your early customers. And I was not only impressed with where it was, but I could tell the potential that it had.
[00:03:14] And just a caveat that, like, I didn't have an HR, you know, background, but certainly, you know, my time at IBM and, you know, in the tech space for most of my career, just when I think about user experience, and just logical, you know, flows of information and where to find things or whatever, it just seemed, you know, compared to some of the other applicant tracking systems, at least, that I had seen.
[00:03:35] And I was like, oh, this is much easier on the eyes, first of all. And just, you know, like you said, when you think about sort of modern, you know, workflows and things like that, it's just so much less clunky than a lot of the other systems that I had seen.
[00:03:49] I mean, when you looked at the market, and you saw the need, I mean, were you looking at small to midsize, you know, companies? Or were you looking to just take on, like, the Workdays and the BrassRings of the world?
[00:04:01] Yeah. You know, back when I first looked at the space, I was at a company that grew from about 100 to 400 or 500 in, like, two to three years. So definitely not huge, but obviously pretty, you know, significant headcount growth.
[00:04:15] So that was kind of what we considered the ICP: companies that are investing a lot in talent and are growing quickly on a percentage basis. So we looked a lot at venture-backed companies, and we did end up talking to a lot of later-stage and public companies as well.
[00:04:31] And we did actually find that there's a pretty good opportunity. When you kind of looked at the, you know, more bottom-of-the-market end, Lever and Greenhouse covered up until, like, low enterprise, high mid-market. But then there was a pretty big gap from there, where people were like, hey, we're kind of outgrowing Greenhouse, but we don't really know where to go next, and we don't necessarily want to go to, like, an HCM because it's a lot.
[00:04:53] So we did think that there's a really interesting opportunity there. And we made sure to build a foundation of the product to be able to serve enterprise in the longterm.
[00:05:00] I think some of the early, early decisions you make about how to architect the product, make a pretty big impact on your ability to move up market over time.
[00:05:08] But when we launched, there was a lot of like top level functionality that was not there yet for these customers.
[00:05:14] So that's why I started with earlier stage companies, but we are today starting to serve enterprise and we do see a lot of that early investment that we made pay off.
[00:05:23] It's still a bit early, but we do have a number of customers in the, you know, 10,000-person range that are getting a lot of value out of the product.
[00:05:31] So definitely the longer term mission is to play there as well.
[00:05:34] So as you sort of evolved the platform and you added some of, like, the CRM kinds of capabilities, I mean, sure, you saw this from your prior experiences, but again, I'm not coming at this from an HR perspective.
[00:05:49] I just immediately saw, you know, back in at that RPO, like the unnecessary sort of disconnect between ATS and the CRM and everyone struggling with candidate discovery.
[00:06:03] And, you know, it's no wonder, because, you know, they weren't necessarily data savvy, and it just seems so illogical that those two things were separate.
[00:06:11] And I haven't gotten a good story of why those two things were separate from the beginning, but it sounds like you recognized that, you know, as you went into that space and then sort of took on some of the more traditional CRM, candidate relationship management, systems.
[00:06:28] Yeah.
[00:06:29] Yeah.
[00:06:29] Part of it is I think how the market evolves.
[00:06:32] So even the latest generation of kind of ATS products in the space prior to us were all started in a cluster around like 2011 to 2013 or so.
[00:06:44] And recruiting teams changed quite a bit of how they work after that, you know, in the subsequent years.
[00:06:50] Again, recruiting operations being one part, but a much stronger emphasis on sourcing and building out in-house sourcing teams was also a thing that happened largely after that generation of products got started.
[00:07:00] And so a little bit of benefit of hindsight and just seeing like, what does market look like today?
[00:07:04] And if you were to build something from scratch, then it would actually make sense to incorporate all that in a single product.
[00:07:10] I think.
[00:07:11] And the other part that we just noticed in the market is that that gap between how TA teams were working and the prior generation of tools was growing quite quickly.
[00:07:19] And so that just led to a lot of point solutions kind of filling some of these gaps that, you know, were quite painful for recruiting teams.
[00:07:25] When you think about, you know, incorporating AI capabilities, and I know, you know, even early on, you've got some scoring in there, but it wasn't necessarily, you know, AI powered.
[00:07:37] But when you think about, I guess part of this is the shiny objects, you know, syndrome on behalf of, you know, not just the customers, but, you know, maybe your competitors.
[00:07:48] But as you think about evolving the platform, how do you think about incorporating like AI, you know, capabilities and doing it in a thoughtful way, not a, hey, look at me.
[00:08:00] Yeah.
[00:08:01] I've got AI too kind of way.
[00:08:03] Yeah.
[00:08:04] That's always an interesting balance.
[00:08:05] I think for us, you know, I was one, just like personally quite excited when I got access to kind of the first modern LLMs.
[00:08:14] It was a little bit, a little bit similar to like programming for the first time.
[00:08:17] And that's like, wow, this is a pretty magnificent tool.
[00:08:20] But the first thing we did internally was basically prototype a bunch of things to get a sense of like, what are these tools actually good at today?
[00:08:28] And we kind of stack-ranked the AI roadmap a little bit around: what are LLMs really good at today?
[00:08:35] What are some of kind of potentially lower effort, higher value things we can do first?
[00:08:39] And we started there to kind of get some repetitions with technology and all of that.
[00:08:45] So we started with, like, outbound personalization, which, you know, LLMs are generally really good at, generating content.
[00:08:52] So that was kind of the first thing we did.
[00:08:53] The other thing that was really interesting to us is we have a bunch of areas in our product that are quite advanced.
[00:08:58] So there's a little bit of a learning curve if you want to use all the functionality.
[00:09:03] And for us, it was quite interesting to see, can we, you know, take some of the learning curve away by using AI?
[00:09:09] And so the second thing we shipped was, like, our kind of advanced candidate search, packaging that in kind of natural language.
[00:09:15] So you can say a thing like, hey, show me all engineers we interviewed more than two weeks ago that rejected our offers and scored really well on this interview, whatever.
[00:09:24] And what is cool about the way we approach it is like we take natural language, but actually build like a structured query that a user can look at.
[00:09:30] It's built exactly the same way a user would.
[00:09:32] And so these things are really interesting.
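The pattern Benji describes here, natural language in, an inspectable structured query out, can be sketched roughly as follows. Everything in this sketch is hypothetical: the field names, operators, and JSON shape are not Ashby's actual schema, and the LLM step is stubbed out as a canned JSON response.

```python
from dataclasses import dataclass

@dataclass
class Filter:
    field: str    # e.g. "role", "last_interviewed"
    op: str       # e.g. "eq", "before", "gte"
    value: object

def to_structured_query(llm_output: dict) -> list:
    """Validate the LLM's proposed JSON into typed filters a user can inspect and edit."""
    allowed_fields = {"role", "last_interviewed", "offer_status", "interview_score"}
    allowed_ops = {"eq", "before", "after", "gte", "lte"}
    filters = []
    for f in llm_output["filters"]:
        if f["field"] not in allowed_fields or f["op"] not in allowed_ops:
            raise ValueError(f"unsupported filter: {f}")
        filters.append(Filter(f["field"], f["op"], f["value"]))
    return filters

# Stand-in for the LLM step: pretend it translated "engineers interviewed more
# than two weeks ago who declined our offer and scored well" into this JSON.
llm_json = {"filters": [
    {"field": "role", "op": "eq", "value": "engineer"},
    {"field": "last_interviewed", "op": "before", "value": "2024-06-01"},
    {"field": "offer_status", "op": "eq", "value": "declined"},
    {"field": "interview_score", "op": "gte", "value": 4},
]}

query = to_structured_query(llm_json)
for f in query:
    print(f.field, f.op, f.value)
```

The design point from the conversation is that the model only proposes the query; the validated, structured form is what actually runs, and the user can read and correct it.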
[00:09:33] We started there and now we're getting to the slightly more ambitious things where, you know, very much still in testing mode, but we are looking at kind of inbound.
[00:09:41] What can we do there around assisting in resume review?
[00:09:45] It's not where we started because we knew that's like a more nuanced and complicated area.
[00:09:50] So the generative AI piece, I can definitely see where that would come in handy for a recruiter or even a sourcer. Like, have we moved past, you know, teaching people how to do Boolean, you know, queries when you can get the same results by asking a natural language
[00:10:09] Yeah.
[00:10:09] You know, question.
[00:10:10] And you're right.
[00:10:11] You know, if you can streamline some of the outreach and engagement, you know, within certain parameters, I suppose, and just keeping tabs on appropriateness of the messages and stuff like that.
[00:10:23] But even more broadly, because what I think about, like, you know, legislation when it comes to, you know, like responsible AI and mitigating bias and things like that.
[00:10:34] You know, people often cite that, well, you know, ATS has been around for a while and they've been doing some type of, you know, most of them do some type of scoring or stack ranking or grading of candidates.
[00:10:44] And is that the same thing?
[00:10:46] I mean, from an audit perspective, it kind of gets all lumped together, right?
[00:10:52] Any algorithm or autonomous, you know, system or, you know, doesn't have to be officially, you know, AI for it to fall under like New York City, you know, legislation, for example.
[00:11:04] But at the same time, you know, there's other areas where I guess we need to look at it, you know, holistically anytime it has the potential to make a decision or to present recommendations, you know, to a human being where it's sort of guiding them down a certain path or whether or where the human may assume that it's giving you an answer like a calculator.
[00:11:29] Yeah.
[00:11:29] In the name of efficiency, you know, they sort of move on.
[00:11:32] So I was curious how you think about that.
[00:11:36] And just in terms of, you know, the fairness, you know, question.
[00:11:42] Yeah.
[00:11:42] Again, this is part of the reason that's not where we started and we don't have anything in the product today that is kind of assistive decision making, but it's something we are exploring.
[00:11:52] The things we focused on first were not around decision making.
[00:11:55] Again, you know, the things I just touched on, and then recently things like feedback summarization, that are just, like, helpful tools for things people are already doing.
[00:12:04] We're now, again, we're kind of ready to look at some of the assistive things.
[00:12:07] We've done some early tests internally and nothing that is rolled out to customers.
[00:12:11] But I don't think what we want these systems to do is, like, you know, blanket recommendations or scoring.
[00:12:17] Like this person is a 99 out of 100 and kind of give a false sense of accuracy.
[00:12:22] But what we can do, I think, is have a pretty objective list of criteria, for example, that you want to review a set of candidates against and then have that tool assist in conducting that review.
[00:12:33] That's, like, what I think these tools are really good at.
[00:12:37] And what we've also really focused on with all the AI features so far is that they're explainable in nature, so that we can kind of see how they reason through something.
[00:12:46] Again, if we think about the natural language search, you can actually see the output and you can correct it.
[00:12:49] So I am pretty optimistic.
[00:12:52] There's like a pretty good middle ground where the end result will probably actually be that the outcome is fairer treatment of candidates because a real limiting factor today is, especially in this market, the ability to, you know, bring attention to potentially like a thousand resumes in a row.
[00:13:12] Like a human will easily be fatigued trying to review these and will then resort to like shortcuts.
[00:13:18] Whereas a machine very happily will go through line by line reading everything.
[00:13:23] So I think there's really cool opportunities to, yeah, just improve the outcome probably for both sides of the marketplace here.
[00:13:31] That's where I was going.
[00:13:33] Like the candidates, like we're in kind of a strange situation, right?
[00:13:37] Like you already had a high candidate, you know, volume before people were doing the, you know, hitting the lazy apply, you know, button and blasting seemingly customized resumes, you know, in mass out to different jobs.
[00:13:52] That they may not even realize, you know, they may not even realize that they've applied to because the system suggested, you know, these jobs look like a good fit or whatever.
[00:14:01] I haven't used those yet, but it just seems like we're sort of perpetuating this, this like cat and mouse game.
[00:14:07] And if you didn't think you had enough resumes before where you had to rely on, you know, the match scores and the stack rank or whatever, because you had a hundred candidates and you didn't have time to interview a hundred candidates or whatever.
[00:14:21] Now you've got, you know, an order of magnitude more, you know, candidates.
[00:14:25] And it just puts more pressure on you based on current, you know, incentives and metrics in terms of efficiency and throughput.
[00:14:34] If you couldn't get through a hundred resumes, you're certainly not going to get through a thousand.
[00:14:38] And so you almost have no choice but to rely on, you know, machine intelligence to tell you who, you know, the top candidates are.
[00:14:47] But while you're doing that, you're also potentially injecting, you know, risk into the process.
[00:14:55] I probably agree.
[00:14:56] I think there are definitely still teams that are not, you know... actually, I don't know which ATSs you are most familiar with, but most of the ATSs that our customers
[00:15:05] I want to take a break real quick just to let you know about a new show.
[00:15:09] We've just added to the network up next at work hosted by Gene and Kate Akil of the Devon Group.
[00:15:17] Fantastic show.
[00:15:18] If you're looking for something that pushes the norm, pushes the boundaries, has some really spirited conversations.
[00:15:26] Google up next at work, Gene and Kate Akil from the Devon Group.
[00:15:33] use, whatever we switch people from, actually don't have any built-in scoring.
[00:15:38] I think it's a little bit more common for that to be sold as an add-on today.
[00:15:43] And so we didn't see it being used as widespread as some people believe.
[00:15:48] There is like other kinds of automation knockout questions that we also support where, you know, you can do basic things like, hey, are you willing to relocate?
[00:15:55] No.
[00:15:56] Okay.
[00:15:57] If they're just not able to travel to a specific location, you can auto-reject people, stuff like that.
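As a rough illustration of the knockout automation Benji describes, here is a minimal sketch; the question fields and rules are made up for the example, not Ashby's actual configuration.

```python
# Hypothetical knockout rules of the kind described: a rule fires when an
# applicant's answer disqualifies them, auto-rejecting before human review.
RULES = [
    ("willing_to_relocate", lambda answer: answer is False),
    ("has_work_authorization", lambda answer: answer is False),
]

def knockout(answers: dict) -> bool:
    """Return True if any knockout rule fires for this application."""
    return any(rule(answers.get(field)) for field, rule in RULES)

print(knockout({"willing_to_relocate": False}))   # auto-reject
print(knockout({"willing_to_relocate": True,
                "has_work_authorization": True})) # passes to human review
```

Note that an unanswered question does not fire a rule here; only an explicit disqualifying answer triggers the auto-reject, which matches the "basic things" framing in the conversation.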
[00:16:00] But the status quo today is still, in the vast majority of cases, human review.
[00:16:07] But yes, the volume is getting obviously pretty significant.
[00:16:10] So again, if this technology is built well, I'm a little bit less worried about introducing risk or bias into the process.
[00:16:20] I do think there's like relatively straightforward ways of doing it well.
[00:16:25] There are some inherent limitations, which we can also talk about of like, they're hard to solve.
[00:16:30] But like, you know, resume is obviously not a perfect representation of someone.
[00:16:33] So there's...
[00:16:34] Yeah, exactly.
[00:16:35] There's like some fundamental challenges with that.
[00:16:38] Yeah, I guess I still have a fundamental concern about the, you know, the job description, resume matching or whatever,
[00:16:44] because it seems like sometimes it's just part of the problem and not part of the solution as far as like improving quality of hire and really truly identifying someone's potential to succeed in the role.
[00:16:58] So I know you guys have, you know, you've done quite a bit of work on just tying to that quality of hire kind of concept, especially of late.
[00:17:08] There are also, you know, a lot of kind of non-AI solutions to this problem that we were thinking about,
[00:17:15] that we also use in our own hiring and recommend customers to use. Even simple things like some custom application questions, we rely quite heavily on that.
[00:17:26] We try to make the effort not super significant, but you ask someone to write like two or three paragraphs about something that's relevant to the role.
[00:17:33] You know, for example, for our engineering manager roles, we asked them to describe the best engineer they've worked with in the past.
[00:17:39] And so we get a sense of like introspection and how articulate are they and all these other things that you may not be able to get from resume.
[00:17:46] That's pretty fast to review.
[00:17:48] So I think there's a lot of ways you can give opportunity to candidates to express themselves a bit better.
[00:17:54] And then also give people that are actually really interested in the role a chance to show it, over, again, just having a resume as, like, the only data point.
[00:18:01] Yeah, absolutely.
[00:18:02] So you've incorporated some level of like interview intelligence, I guess.
[00:18:07] It's actually much simpler than that.
[00:18:09] It's basically just, you know, questions you can add to your inbound, basically.
[00:18:12] It's more just like, and a lot of ATSs have that, but for our own hiring process, we leverage it, I'd say, more than some other teams that we've seen in the past.
[00:18:20] You mean for like a recruiter phone screen kind of thing?
[00:18:23] Even just the application itself.
[00:18:25] Okay.
[00:18:26] Oh, okay.
[00:18:27] Yeah.
[00:18:27] And for some roles, we've also experimented with like, you know, we don't have as many questions up front, but then the first step is like a quick survey with like two or three questions rather than like an initial call.
[00:18:36] But there's like different tools you can use to structure a process where you can assess a pretty high volume of candidates without talking to everyone and giving them a chance to, again, express interest in the role beyond just submitting a resume, which I think can be helpful.
[00:18:49] Yeah.
[00:18:50] I expect we'll see some more changes there in, like, how people structure their process.
[00:18:55] Okay.
[00:18:56] On the scheduling side, it sounds like, just based on what you told me about, like, the impetus for starting Ashby in the first place, scheduling was always on your mind.
[00:19:07] And, you know, I looked at quite a few of these, these standalone solutions in my time at Talent Tech Labs, because it seemed like the value was so obvious that I was surprised I hadn't seen, you know, that kind of functionality before.
[00:19:22] But it sounds like you had already decided that, you know, this, this is a problem that I know how to solve.
[00:19:28] And there's no reason why this shouldn't be just baked into, you know, the core platform, excuse me.
[00:19:32] So could you just, like, unpack that a little bit?
[00:19:36] Happy to speak to that.
[00:19:37] And I think again, it comes a little bit back to kind of what were the expectations of what recruiting looks like back in kind of 2012.
[00:19:44] I don't think there was as much of a recruiting coordination function and role, and the volume of interviews per hire was, like, lower.
[00:19:53] But when I was working in my last role at PlanGrid, a, you know, 100 to 400, 500 person company, half the recruiting team, which was not very big, was basically spending all their time on coordination.
[00:20:05] And so you just looked at that as, like, that is a thing that computers are really good at today.
[00:20:11] And there's really just a lot of opportunity to give people time back.
[00:20:14] So, you know, unlike automating decision making, which is really, really hard, automating scheduling, where there is, like, an objective right answer, is much easier.
[00:20:23] And so we did from day one look at kind of where can we bring efficiency to people using the product.
[00:20:31] And so the scheduling piece, you know, you could probably call it AI in the sense that it's, like, very clearly machine intelligence in some form.
[00:20:38] It uses like constrained optimization and a bunch of other ML techniques.
[00:20:41] It's not like LLM AI.
[00:20:43] That was something we wanted to lean into pretty early.
[00:20:45] And so it's something built into the foundation of the product.
[00:20:48] Again, just the thought of like, hey, there's, what are all the jobs that a recruiting team needs to do?
[00:20:52] And which of these things can we really save time?
[00:20:54] And the interesting thing is that, as you mentioned, that technology has been around for a while as like kind of more of a point solution.
[00:21:02] I think the adoption of that has really taken off actually more in the last few years or so, where now people have smaller teams and they're needing to get more done with these smaller teams.
[00:21:11] And now there's a little bit more appetite to kind of learn new technology and lean on technology.
[00:21:16] Whereas before, more teams were just putting more bodies behind problems.
[00:21:20] But I think now people definitely focus on like, what can we actually solve with software and make more scalable?
[00:21:25] The integration had to be there at some point anyway.
[00:21:29] So it just seems pretty logical that it would be baked in.
[00:21:33] It's not a trivial thing to build, that said, is it?
[00:21:36] Yeah, no, I think there's a lot, because, I mean, you support, like... I mean, this isn't just like a one-on-one meeting, right?
[00:21:41] It's like, you've got to look at availability across the hiring team and, you know, maybe load balancing.
[00:21:48] And preferences and custom hours and all sorts of stuff.
[00:21:52] And the other big part is like actually interviewer training, which is something I managed as an engineering leader.
[00:21:57] But especially on the engineering side, there's often, like, a requirement to shadow or reverse-shadow a certain number of interviews before you get graduated.
[00:22:04] And kind of managing all of that, all of these kind of scheduling constraints together.
[00:22:08] Definitely took us a couple of years to get all that in place.
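A toy sketch of the kind of constrained search the last few exchanges describe. A real scheduler would use a proper constraint solver over calendars; here a brute-force search over a hypothetical team illustrates three of the constraints mentioned: availability, load balancing, and trainees who must shadow rather than staff a panel alone. All names, slots, and limits are invented for the example.

```python
from itertools import combinations

# Hypothetical data: free hour-slots per interviewer, current weekly interview
# load, and trainees who must shadow (never staff a panel on their own).
availability = {
    "alice": {9, 10, 14},
    "bob":   {10, 11, 14},
    "cara":  {9, 14, 15},
}
load = {"alice": 3, "bob": 1, "cara": 0}
MAX_LOAD = 4
trainees = {"cara"}

def schedule_panel(candidate_slots, size=2):
    """Pick a slot and a panel satisfying availability, load limits, and the
    no-trainees-only rule, preferring the least-loaded panel (load balancing)."""
    best = None
    for slot in sorted(candidate_slots):
        for panel in combinations(availability, size):
            if any(slot not in availability[p] for p in panel):
                continue                        # someone is busy at this time
            if any(load[p] >= MAX_LOAD for p in panel):
                continue                        # someone is already at their limit
            if set(panel) <= trainees:
                continue                        # trainees can't run a panel alone
            cost = sum(load[p] for p in panel)  # load-balancing objective
            if best is None or cost < best[0]:
                best = (cost, slot, panel)
    return None if best is None else (best[1], best[2])

print(schedule_panel({10, 14}))  # → (14, ('bob', 'cara'))
```

Even this toy version shows why the feature took years: every new constraint (training status, preferences, custom hours, time zones) multiplies the cases the search has to reason about jointly.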
[00:22:11] Yeah, I didn't mean to make it sound simple.
[00:22:14] It's a, the end result seems so simple.
[00:22:17] Hey, look, we got a meeting on the calendar.
[00:22:19] But what it took to get there is a lot of, you know, logic and steps and data to look at and coalesce.
[00:22:27] So, yeah, you're right.
[00:22:28] I think it's, it takes the burden.
[00:22:31] And I mean, look, I'm not trying to have automation or AI cause, you know, job loss by, by any means.
[00:22:39] There's all kinds of things that people are capable of.
[00:22:42] If only you give them the opportunity to, to show, you know, what their other skills are.
[00:22:47] And so there's always other work to be done.
[00:22:50] And so, but I think one of the other big things about it is it improves the experience for everyone involved.
[00:23:00] I mean, for a candidate to just be able to quickly schedule, whether you're doing it through a conversational, you know, interface or a quick little, you know, select-a-time-here, like a Calendly kind of thing.
[00:23:12] No matter how you do it, it's still better than the email ping pong, you know, trying to coordinate across, especially if you have a series of, of interviews to schedule or group interview.
[00:23:22] That's something not everyone necessarily agreed with in the past, but again, something that's become much more popular.
[00:23:28] I think first because of remote, because, you know, time zones became a big headache for people.
[00:23:32] And then now just like, they actually need to just be more efficient.
[00:23:35] Yeah.
[00:23:36] You know, this is a very, you know, sort of human, you know, relationship driven, you know, process and you want to feel like people are paying attention or whatever.
[00:23:44] So, so I can understand why there might've been some initial hesitation, but of all the places where you could take a human out of that loop, maybe it's 20/20 hindsight, but it seems like a pretty logical place to kind of automate.
[00:24:00] Right. So you had a really, really interesting post the other day around how you're prioritizing the, the velocity of, you know, the product development versus trying to, you know, paint these rosy pictures about, you know, future, you know, capabilities.
[00:24:19] And, and I thought it was really a very logical way to think about it.
[00:24:27] Like, well, obviously I'll let you explain it in more detail, but it just seems like, yeah, you're going to promise people, or, not promise, but at least set expectations.
[00:24:39] Even with the disclaimer slide that says, you know, these are forward looking, you know, things, right.
[00:24:43] Things could change. You still want, once it's there, once it's, you know, on the slide, once it's on the page, people are going to be like, oh, well, yeah, they're missing this feature, but they'll have it in six months.
[00:24:55] I mean, you know, what's the big deal? I'm not going to not go with them when, you know, it'll be here before we know it.
[00:25:02] Right. And so, but just to think much more, much more logically through how you evolve the product and stuff, maybe you could go into a little bit of detail there and how your team may have, you know, sort of contributed to your thought process and what they think about that.
[00:25:21] Yeah, I'm happy to speak to that. And for folks that maybe don't have as much context: I wrote a little kind of piece on our approach to the product roadmap.
[00:25:29] You know, I think in B2B, it's very common to have a longer term roadmap and it's often, there's kind of two reasons, I think that seem quite plausible, like good reasons to have these.
[00:25:40] One is, you know, you might expect a customer to just kind of want to know, like, what's going to happen with the product. So that's a reasonable ask.
[00:25:46] And then for vendors, it can be helpful to potentially fill some gaps that they have in a product. Problems that we've seen in the past is that, you know, these timelines and commitments are often missed.
[00:25:59] So it just leads to kind of bad expectations. And it's actually, I've been a little bit surprised, since starting Ashby, how much people are still willing to believe these roadmaps.
[00:26:11] I would expect a little bit more skepticism, but we sometimes see, when we talk to someone, you know, researching a move off their existing ATS, that they're like, oh yeah, but they said they're going to build this in the next three months.
[00:26:19] And then the vendor has not actually shipped anything like six or nine or 12 months later. So we basically decided not to do that.
[00:26:29] When we talk to customers about, you know, what's on roadmap, et cetera, there is like a three month window typically at most.
[00:26:35] And that's kind of anything we talk about is stuff that we've, we've actively are working on and we kind of know the scope and we've committed to shipping it.
[00:26:43] I think part of that is, like, a lot of companies actually start like that and then eventually give in to the, like, hey, customers are asking where we're going next, so let's start putting out a story.
[00:27:20] And so part of it is just like we've stayed in this more natural state of kind of evolving the product more iteratively.
[00:27:25] The other side is that, especially in the last few years, we've seen the recruiting landscape change really drastically, like every couple of months almost.
[00:27:35] And so it's really influenced like what is important to people.
[00:27:38] And we've benefited a lot from being able to have a pretty blank slate and be like, oh, you know, there's this new regulation around salary transparency.
[00:27:47] Let's be the first vendor to actually have functionality in place for that.
[00:27:52] Or now people are struggling with a lot of inbound volume.
[00:27:56] Let's shift focus on that versus like outbound sourcing.
[00:27:58] And so not having these longstanding commitments that we then have to, like, pull back on
[00:28:03] and, you know, maybe reset expectations with customers, really allowed us to be much more responsive to the market.
[00:28:09] And also on the engineering side, we can experiment with things and then decide it's actually not worth doing.
[00:28:15] Because it happens sometimes where we want to build a thing that we think will be quite valuable.
[00:28:20] And then we start looking at it in like a couple of weeks and we're like, oh, this will actually take a really long time.
[00:28:25] There are these three other things that we could do in that time instead that are actually higher value.
[00:28:29] And so I think it's just been really healthy for us.
[00:28:33] And we've instead really just focused on how we make the product move as quickly as possible, which means that we don't have to make these longstanding commitments, because customers can rely on just the trajectory and velocity instead of, like, specifics.
[00:28:46] And so we have a pretty clear vision of where we want the product to be in 10 years, but we don't know exactly what's going to happen in the next six months or so.
[00:28:52] But what we do know is kind of the key areas of product we're investing in and our general commitment to make the product evolve really quickly.
[00:29:00] I feel like you're playing it very smartly, in the sense that you have the vision and you're clearly delivering value, because your team is very deliberate about the features and functions: these are things that people have requested.
[00:29:18] These are things we know, we've assessed that we can deliver.
[00:29:21] And basically you're under promising and over delivering in a sense.
[00:29:27] But you've got a North Star, you know how it evolves.
[00:29:31] You just don't necessarily need to say this is going to be in this particular sprint or in this particular quarter or whatever.
[00:29:37] So just going to the example where like a team may start to work on something and realize it's going to take a lot longer.
[00:29:44] I mean, would you potentially like partner with someone who seems to have like done it well?
[00:29:50] Or like how do you think about when to build versus partner, I guess?
[00:29:57] That's a good question.
[00:29:58] We haven't really had, you know, we haven't had cases where individual projects would have turned into like a partnership like that.
[00:30:05] But we do think about build versus partner.
[00:30:08] The scope that we felt strongly about is kind of just the initial set of these four core modules.
[00:30:14] So again, analytics, CRM, ATS, and schedule automation.
[00:30:18] We knew that we wanted these to be native capabilities.
[00:30:21] And so within these, we, you know, we tend to build everything ourselves.
[00:30:26] There are also clearly things we're not planning to do,
[00:30:29] which we think just don't align with the things we're great at.
[00:30:32] For example, assessments.
[00:30:34] You know, obviously there's some stuff that every ATS has around like if you want to administer like a take-home test, there's like the process around it to do that.
[00:30:41] But like there is obviously no in-depth assessment capabilities.
[00:30:44] But when it comes to like scoping projects, et cetera, it's much more for us around the, you know, there are 50 things we could be doing.
[00:30:52] If this thing is like high value but really high effort, we may just deprioritize it in favor of other things in their place.
[00:31:00] There was a nice example of that where a while back we were looking at building scheduled dashboards.
[00:31:06] So we have, you know, advanced capabilities around dashboarding and reporting.
[00:31:09] And we looked at scheduled dashboards like two to three years back.
[00:31:13] And it was like, hey, this is actually a pretty high effort.
[00:31:15] And it's unclear that it's like super, super valuable.
[00:31:18] And so we started with like a trimmed down version where that's now called alerts where you could build like certain custom reports and build notifications around just these.
[00:31:27] And it actually turned out to be a super valuable feature in and of itself, and was much less effort.
[00:31:31] We eventually shipped the scheduled dashboard thing as well.
[00:31:34] But we kind of learned as we started that project and then changed the scope.
[00:31:38] And we did all of that without communicating anything to our customers on, you know, what would be coming.
[00:31:42] And so I think to your point, another nice benefit of just we can delight people when bigger features just like drop unannounced, which is kind of nice.
[00:31:52] Without a lot of lead time.
[00:31:54] Nice little gift.
[00:31:55] For your customers, right?
[00:31:58] No, that's very cool.
[00:31:59] Are there any tools like where you see where you're seeing AI being used in really meaningful ways that, you know, across the talent lifecycle, maybe in an area that like, you know, you're not going to go into, right?
[00:32:11] Like, is there anything, I mean, you don't have to name a favorite, you know, vendor or whatever, but even just like conceptually or like in a particular, you know, stage of the talent lifecycle, is there anything like intriguing that you think has a lot of promise?
[00:32:25] Yeah, it's a good question.
[00:32:26] I think we're still, you know, somewhat early, but there's definitely areas that we are seeing.
[00:32:30] So again, I think the top of funnel kind of assisting inbound review is an interesting one today, mostly in kind of point solutions.
[00:32:38] But then there's obviously the, you know, the Gong-for-recruiting use case, which is really valuable.
[00:32:43] But there's a question of like, some of that was there pre-LLM, pre-AI, some of the vendors were around before that.
[00:32:48] So I think a lot of that is actually in the recording and the transcripts for people.
[00:33:02] And then maybe the third area that's kind of slowly emerging is, like, really top-of-funnel sourcing and, like, finding candidates.
[00:33:02] That's always kind of interesting area of the market because you're competing with LinkedIn on spend and attention.
[00:33:11] So I'm kind of curious.
[00:33:12] There were like a lot of companies in that space before with different levels of success.
[00:33:16] So I don't know where we'll go, but I think people do generally like the idea.
[00:33:20] And it makes a lot of sense to be able to search more semantically for people, which we are, you know, we're more focused on doing everything in your existing set of candidates.
[00:33:28] But I think this idea of like, again, a more natural language search, because yeah, I'm sure LinkedIn will get there at some point too.
[00:33:33] But you can't really easily say, you know, show me engineers that worked at seed-stage startups two to three years ago and have five years of experience.
[00:33:41] That kind of use case is pretty interesting for the technology we have today.
[00:33:45] Yeah, I agree.
[00:33:46] I'm not, I know LinkedIn just made a bunch of announcements, but I'm not holding my breath that they're going to be the ones to solve this problem.
[00:33:55] But when you think about casting as wide a net as possible, where you know that there are people with potential and there are good candidates everywhere, without necessarily the pedigree of, you know, working at a FAANG or working at, you know, a unicorn, or, you know, going to an Ivy League school or whatever.
[00:34:15] It just seems like, you know, those people are out there, they may not be actively looking, but that doesn't mean you shouldn't reach out to them anyway.
[00:34:23] And if you can do that at scale and you can do it intelligently and maybe even pick up, you know, weak signals of their intent, you know, I don't want to get into data privacy, you know, issues about, you know, tracing people's, you know, digital footprints.
[00:34:39] But it just seems like there's enough, if you could aggregate enough of these, you know, weak signals, you know, it may give you an indication that someone is likely to step up their search or to shift from passive to active.
[00:34:54] Or, I don't know, it just seems like there's a lot of data out there that if you aggregate it and you know what to look for and what is a signal of intent that you can source candidates before they become, you know, hot commodities or before they hit the market or whatever.
[00:35:12] I just think things like programmatic advertising as intelligent as people may think it is, and maybe from a technical perspective, it is.
[00:35:21] You're still just hoping that the right people are paying attention and take the initiative to actually, you know, apply.
[00:35:30] And so if I flip it backwards, like, and I think about even if you're an organization that's sort of thinking about that and there's one leader that's actually responsible for all of those different, you know, sort of entry points and ways in which you can discover talent wherever it exists.
[00:35:51] Because even if you've made those investments, you still have to look across multiple places, right?
[00:35:59] They've got to look in Ashby and they've got to look in the internal talent marketplace and they've got to look in, I guess if you don't have Ashby, you've got to look in the CRM also.
[00:36:09] And then looking in your contingent labor, you know, platform, which may belong outside, you know, HR in a lot of cases.
[00:36:18] So it still seems like there's a lot of, you know, hurdles.
[00:36:21] Have you had customers talk about or request that there's some level of integration with, you know, like an internal talent marketplace or with a contingent workforce so that you get a full view from a CRM perspective of people you have access to?
[00:36:37] Yeah, I mean, we do have a ton of integrations with kind of, you know, a variety of like what I just call generally top of funnel tools, whether it be job boards or other sourcing tools or other marketplaces, etc.
[00:36:49] So there's a ton there.
[00:36:51] Internal mobility also definitely matters to us to a healthy degree.
[00:36:56] It's like something where we can do quite a bit more, but customers are using us for that today already to kind of run their internal hiring processes and also in some cases kind of search through internal talent.
[00:37:09] You know, I think it depends really on the size of company you're working with.
[00:37:11] For our customers today, internal mobility is a lot about making sure that they advertise the opportunities internally well and are able to run a good process.
[00:37:21] And I think over the last year and a half or two years, there were some roles that were primarily advertised internally, because companies weren't growing headcount a lot and wanted to give people the opportunity to move within the company if there was less demand in a certain area.
[00:37:35] So that definitely matters to us. In terms of other sorts of candidates,
[00:37:40] that is much more an integration thing for us.
[00:37:43] So we definitely play there in terms of integrations, but it's not as much natively in our purview.
[00:37:48] I don't know how long until there's more mass adoption of how we think about the dynamics of the workforce and their own sort of agility in terms of talent attraction and talent mobility and things like that.
[00:38:03] But I think we'll get there.
[00:38:04] We're probably, most organizations, like you said, maybe it depends on the size of the organization.
[00:38:09] But yeah, we're probably still at least a few years away from that being a common use case.
[00:38:16] I agree.
[00:38:17] So Benji, when you hear that phrase like elevate your AIQ, I mean, what do you think people need to do in this AI-driven future of work?
[00:38:28] What do you think people need to do to sort of up their game?
[00:38:30] I actually think even at a higher level, one of the trends we've seen in recruiting specifically in just like last two years or so is that there is more emphasis on tools in general and automation.
[00:38:44] And AI is kind of one part of that.
[00:38:45] But kind of back when we talked about scheduling, for example, I think, you know, five years ago, you could definitely get away with some level of resistance to adopting tools.
[00:38:53] And just like, hey, this is just the way I do things.
[00:38:55] I'm a little bit more artisanal.
[00:38:57] And I, you know, I don't necessarily want to engage with like new and fancy technology.
[00:39:02] But I think it's really changed in the last two years or so.
[00:39:05] So recruiters that really do well in this environment, we just see them lean into the tools really much more.
[00:39:11] And within Ashby, just from our vantage point, it is things like scheduling automation, or even things like learning keyboard shortcuts that they may not have known before, to move through the product really quickly.
[00:39:22] And AI is just like one other tool in that toolbox.
[00:39:25] So probably the biggest part of the answer is really just like people have to get comfortable learning again and being willing to be kind of a beginner in something and changing their workflows potentially somewhat significantly.
[00:39:36] Where even in some roles that historically maybe didn't engage with technology as much, but there's just so much opportunity to get time back if you're kind of willing to invest in learning things up front.
[00:39:47] No, that's great.
[00:39:49] Benji, thank you so much for taking the time
[00:39:52] to have this conversation with me.
[00:39:53] I know it's been an overdue conversation.
[00:39:55] We've chatted over LinkedIn and over email and it's been a few years.
[00:39:59] So I'm glad to finally have this one-on-one conversation.
[00:40:03] And I think you shared a lot of insight with the audience.
[00:40:06] So greatly appreciate it.
[00:40:08] Awesome.
[00:40:08] Thanks so much for having me.
[00:40:10] All right.
[00:40:10] Thanks everyone.
[00:40:11] Until next time.


