…the most impressive data science teams that I know have a very healthy skepticism of the value of data science; they know exactly where it fits in the value chain.
An interview with Gil Dibner, the founder and general partner of Angular Ventures, a $41M fund investing in early-stage enterprise deep tech startups.
Peter Zhegin:
Hello, and welcome everybody. My name is Peter Zhegin, and I'm talking to Gil Dibner, the general partner at Angular Ventures, an early-stage fund that focuses on enterprise deep tech. It's a pleasure to have you here, Gil.
Gil Dibner:
Hi, Peter. Thanks for having me.
Peter Zhegin:
Awesome. To kick off, tell us a bit more about yourself and about Angular.
Gil Dibner:
Really briefly: I'm an American Israeli in London, and I have been doing venture for about 15 years. I started in Israel and did VC over there for about seven years, then I was recruited to London by Index Ventures and got to know the European enterprise scene. I decided it made sense to raise a fund that would try to bring a bit of an Israeli-style seed approach, focused on early stage and on enterprise deep tech, to the European ecosystem.

That gave birth to Angular in 2018. So Angular is a $41 million, Silicon Valley-style, early-stage venture firm. We write checks of $250K to $1.5 million, and we're usually either the first or second check in a company's life. We'll invest anywhere pre-A [stage] and, geographically speaking, anywhere in Europe and Israel. We've even done one or two outside of those geographies: we have one in Hong Kong and one in Canada. But the bulk of what we do is Europe and Israel. Almost all of our companies are focused on penetrating the US market as a big part of their early go-to-market strategy. And they're all either enterprise or deep tech or both. As is probably going to be relevant to this conversation, that means we do a lot of data and data pipelining, machine learning, AI tooling and stuff like that.
Peter Zhegin:
Tell us about a couple of companies, just to give a flavour of what kind of startups you support?
Gil Dibner:
I think, you know, most relevant to your audience: we have something pretty exciting in the DataOps space that I can't talk about yet; it's very early. We also have a bunch of companies that are using data, machine learning and deep learning to provide insights in specific verticals. We've got one of those in agriculture, one in field service optimization, and one in logistics/payments operations; we can talk more about that.

And then we have a bunch of companies that are in various places in the deep learning/machine learning stack. We have three bets in that category. We have one called Colabel, out of Berlin, that is in the democratisation space, hardcore democratisation: making it super easy for any knowledge worker to deploy a machine learning model, way off on the democratic end of the spectrum. Then we have another one that's way off on the non-democratic end of the spectrum, trying to reimagine the way that deep neural nets are designed, built and optimised. They've basically built a graphical IDE for deep neural network design, but it's really targeted at the most elite, most advanced data scientists who are handcrafting specific DNNs. And then we have another company called Valohai, out of Finland, which is basically building an end-to-end machine learning ops platform. Their tagline, or one of their taglines, is 'the last MLOps platform you'll need to buy'. The idea is that there are so many tools here that it makes sense to try to become the end-to-end platform for multiple stakeholders with multiple needs, without being opinionated at all about which infrastructure you're using or how you're deploying and building models. Let's try to bring this under control, much the way that GitHub and GitLab did for normal code. We can talk about that as well.
What does it mean to build an enterprise deep tech startup?
Peter Zhegin:
That's a great overview, and I guess it pretty much describes what it means to be deep tech. You mentioned the stack and you mentioned some verticals. There is another interesting word in Angular's description, which is 'enterprise'. Not every founder, especially one with a research background, understands the difference between B2B, SMEs, and enterprise. What does it mean to be an enterprise deep tech startup?
Gil Dibner:
It's a really great question. I actually wrote a blog post on this very topic: how do we define enterprise? How do we define deep tech? Those words get used by a lot of people, and I think they're used to mean different things.

VC used to be pretty obviously about tech, mostly, I think, because it was about finding companies that were high growth and that had some kind of barrier to entry that could sustain their growth or their margins as they grew. That was so obviously technology, in most cases, that VCs didn't have to say 'I'm a technology VC'; it was sort of implied.

Then with the internet, you had the emergence of very high-growth companies that were not really tech companies. One can have a debate about whether WeWork was tech or not; it was clearly a high-growth company, at least at some point in its lifecycle. And there are many other companies like that, Zalando and many others, that did amazing things but weren't really technology in the sense of having a barrier to entry, or some kind of rocket-science approach, or some kind of deep tech innovation.

I think some VCs, like myself, started using the word deep tech to try to capture the idea that we are looking for companies that are taking some kind of technology or development execution risk when we make the investment, companies that are on the bleeding edge of the technology curve, and companies that have some potential opportunity to create a defensible barrier to entry over time.

There are some caveats to that. I've been around the block long enough to not really believe that there is such a thing as intellectual property in software. Back when I was starting out as a VC, early in my career, when I would ask naive questions about intellectual property, or defensibility, or barriers to entry, or whether something could be replicated, VCs who were way more experienced than me would sort of laugh and say: don't you know there's no such thing as intellectual property in software? Don't you know that if you put any problem in front of developers who are smart enough, they'll figure it out? And I think that's generally true. So it's a very nuanced question how much defensibility is really baked into these products, and defensibility comes from a lot of different places.

I think by saying deep tech, we're sort of emphasising that we look for that defensibility, and that we want to develop a thesis around defensibility to the extent that we can. There are many sources for it; we can talk about the sources. But the other thing probably worth mentioning is that we also use the phrase aspirationally, to capture the idea that we would be glad to invest in very cutting-edge, almost sciency sorts of projects, whether that's synthetic proteins, synthetic biology, nanotechnology, crypto, space tech. We are open to those things.

As a practical matter, because I ultimately want to return capital to my LPs, I filter all these opportunities through the filter of 'will we make money on this?'. That often rules out some of those more moonshot things at the early stage. I would also argue that there are companies where it actually makes sense to put in a $20 million check but not a $2 million check. I'll give you an example. I met a company recently in Israel that has a completely novel approach to medical imaging. No one's done this before; it revolutionises medical imaging. They want to build a diagnostic medical imaging device and go compete with Philips. You're not going to get very far with $2 million if that's your game plan. So that's a case where it actually makes more sense to invest $20 million than $2 million, and that can become prohibitive for a small fund that specialises in smaller checks. So that's the deep tech side.

On the enterprise side, just to finish answering your question, and I'm sorry for the long answer: enterprise also has sort of two meanings, right? B2B is probably the overarching category, where the decision maker for the product is a business. And Angular is exclusively B2B focused. People ask me what that means; my simple answer is that if your company requires consumer adoption to succeed, it's not for us. I don't care if it's B2C, B2B2C, B2whatever: if it requires consumers, like my mom or my sister, to figure out if it's interesting, it's not for us; I don't have that skill set. So B2B means the ultimate decision maker on whether to buy your product and pay you money or not is a business person making a business decision.

Enterprise is sort of a subsector of that; there's kind of enterprise and SMB. Enterprise can mean you sell to large companies; that can be one definition. And to look at it from another angle, enterprise is almost like a shorthand. There's a whole universe, in Hebrew you'd say 'the whole Torah', of what it means to build an enterprise software business, right? There's a certain way you sell, a certain way you package up products, a certain way you go to market, a certain type of salesperson you hire. There's a whole army of people who call themselves enterprise software sales executives, and that's what they do. Before Coronavirus, they would get on planes and put on suits and close big deals. And I think that world is slowly changing; we can talk about that.

But I think that's also something that enterprise means: it either means you're selling to large customers, or it means you're selling a product in a certain style of sales. It's a very well-trodden path. Now, it's important to have the plasticity of mind to see that that traditional enterprise sales model is fairly rapidly vanishing from the world and being replaced with other things. But that's another definition.
Product/founder fit, aligning team skills to the sales process
Peter Zhegin:
You've mentioned several important things that I'd include under this enterprise umbrella: special sales processes, and, I assume, special kinds of buying and purchasing processes, cybersecurity standards, etc., that enterprises have and other types of customers don't. How does that affect the skill set, or the composition, of a startup's founding team?
Gil Dibner:
Great question. I don't have a simple canned response to that. I would say it probably doesn't determine the makeup of the founding team or determine success. But think about the journey of any founder. We talk a lot about product/market fit, and most people know what that means. Then people started talking about founder/product fit, or founder/market fit: do you understand your market? Does this market speak to you and motivate you enough to spend the next 10 years of your life building a product for it?

I think there's probably also a concept of sales motion/market/product fit. The bottom line is that all these things have to be aligned: who the founders are, what the product is, what the price is, how it's packaged up and delivered, how it's marketed and sold and purchased. All those things have to make sense and have to be aligned. And I think the journey of any company, of any founder, is to map that universe and try to understand: what's my best path to fast growth?

I'll give you an example. We have a company in our portfolio that sells automation software to the oil and gas industry, and we have the same exact dynamic in the healthcare industry. Oil and gas and healthcare are two markets where, for the most part (there are a lot of innovative startups and a lot of innovative purchases of tech, but for the most part), the majority of buyers and IT executives have a pretty set notion of how they're going to buy software, how they want that software sold, and how they want that software priced.

And we're having this back and forth, actually, in both companies: a constant tension between the VCs, Silicon Valley-style VCs who want to see ARR, recurring revenue, per-user, per-seat or usage-based pricing, and the customers who say, 'No, I would just like to pay you for a 10-year license.' Right? And that's complicated. 'And I would rather pay you for professional services than for a license.'

And the [startup] CEOs say: 'I'm happy to give away services for free, but I need to charge you for the software.'
  • 'But I can't pay you for software; I can pay as much as you want for professional services.'
  • 'But it's the same thing!'
  • 'No, no, I want to buy it this way.'

So there is a process that all companies have to undertake of aligning themselves with the way their market wants to buy, and maybe very gently and gradually educating people in their market that maybe there's a better, more efficient way to do this.

To get back to your question: sometimes a founding team has a pretty good idea, or can get a pretty good idea pretty quickly, about what this is going to look like. It's almost like dropping a rock star into an entourage, right? We know we're going to need a sound guy, we know we're gonna need a lighting guy, we know we're going to need a bunch of roadies, we know we're going to need all of this infrastructure around us. And whether it's Madonna, or Springsteen, or Beyonce, it doesn't matter; you can just drop that person in, and it'll all work fine. And then other times, you really find yourself struggling to figure out exactly what the path is going to be and how your market operates. This is especially the case when the market's never seen anything like it before. So, in the healthcare example I was giving you, they're selling remote patient management and monitoring solutions. They're basically competing with system integrators; there's no software vendor, certainly no established software vendor, that provides what they provide. So customers can't even imagine that you can provide this out of the box, configurable, self-service; it doesn't even occur to them that you could. So there's a lot of education that has to go on.

Where might the value of a product be, apart from data science?

Peter Zhegin:
Regarding this journey of finding insights, understanding how the market works and what customers want: maybe there is specific advice, or a framework, relevant to data science startup founders?
Gil Dibner:
So I think in data science it's a fairly specific set of problems, different from other deep tech disciplines. In other disciplines there are kind of two extremes, and they rarely work. One is to say, 'I'm going to find a way to isolate my IP and just license that IP', and that very rarely becomes a scalable business. The other extreme is to say, 'Okay, I'll just build the whole system myself: I'll build the car, I'll build the ultrasound, I'll build the satellite.' And that's also super difficult, because then it turns out that the value prop you have is actually a tiny sliver of a massive stack.

I think you have a similar phenomenon in the data science and machine learning world, although it has a different flavor. What I've observed in many of my interactions with data-science-driven teams, PhD-driven teams, is that, as they say, if you're a hammer, everything looks like a nail. If you're a PhD in mathematics, or statistics, or CS, if you're a data scientist, everything looks like a data science problem. People in those disciplines tend to overweight the value of the data science and underweight the value of the rest of the product, and the needs and pains of real users in real life. And that sort of boils down to product.

In other words, there's technology and there's a product, and they're not the same thing. Oftentimes the model is not where the value is. The value is in the product complexity; the value is in understanding how to communicate ROI to customers. The value ultimately is in building a system that is easy to deploy and hard to remove, a system that people love using, a system that spreads its tentacles through an organisation and really can't be removed.

I'll give you a classic example from 10 or so years ago; people now are probably too young to remember. There used to be these things called tablets that were all the rage after the iPhone: the iPad. And there was a point in time when every enterprise software company was building an iPad version of its offering. The reason was not that this was value-added. The reason was that people loved it: customers loved having a reason to get IT to pay for an iPad, and they loved being able to walk into a meeting with their iPad and show their colleagues, 'Hey look, here's my dashboard for... how many widgets we manufactured yesterday.' That was not measurable ROI. But from a product perspective it was very important in terms of building exposure.

Another common trick is trying to figure out if you can develop a screen that your users' manager will want to see. And that can go all the way up the stack. If it's a developer tool, we'd like to give the team leader a screen; if it's a tool for team leaders, we'd like to give the VP of engineering a screen; if it's a tool for CFOs, we'd love to give the CEO a screen. So it can go all the way up the stack. And the ideal is to get your software to have a screen that goes into the board meeting. If, at the quarterly board meeting, there's a screen from your software, you've won, right? A classic example would be Salesforce or SAP. Oftentimes board presentations at huge companies will have a screenshot taken directly from Salesforce or SAP. That's a symptom of the success of those companies: they are so deeply embedded that they're all the way up in the boardroom. So those things are not about algorithms or specific data science problems. They're about wrapping your software up in a form factor that becomes very, very compelling.
Types of data science problems/startups that one may build
Peter Zhegin:
What I would love to do now is take a step towards a deeper-level discussion and talk about the types of data science problems. I know you've mentioned at least three types of problems: native problems, easy problems and hard problems, and there's a kind of evolution in the applications that solve them. Can you talk a bit more about these distinctions? And where do you see the most value for data science, now or in the future?
Gil Dibner:
A really good question. I'm not sure I know what you mean by the distinction between hard and native; the way I think of it is that there are places where you need a bespoke model. There's some subset of applications where no vendor is going to show up with a model; you need to build the model yourself. That's part of the world, and increasingly, I think we're going to get to a place where maybe most of the value will be in those bespoke ones. But by volume, by number of users, most problems that would be considered deep learning problems today might actually be off-the-shelf problems tomorrow. Things like speech to text, NLP, voice recognition, speaker identification, emotion detection, entity extraction: things where you're probably going to be able to pull models from the public domain and deploy them. They will work well enough in most cases with a little bit of tweaking. And they may even become baked into other software. In fact, that's already happening, right?

Almost every company in our portfolio has some degree of machine learning embedded in some aspect of its software, and no one really thinks about it. Some of them are building it themselves, some of them are using off-the-shelf models; it doesn't really matter, because the customer doesn't care. I think that's also happening. So: is it a situation where a very specific model for a very specific data set is necessary, or can it be more generic and still deliver a lot of the benefit? That's one of the distinctions I would make.
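[Editor's note: to make the off-the-shelf point concrete, here is a minimal sketch of pulling a public model and deploying it, using the open-source Hugging Face transformers library. The model name is one real, publicly available example rather than a recommendation, and the sample text is hypothetical.]

```python
# A minimal sketch of "pulling a model from the public domain":
# off-the-shelf named entity extraction with Hugging Face transformers.
from transformers import pipeline

# dslim/bert-base-NER is one publicly available NER model on the Hub.
ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")

text = "Shipment 4412 from Rotterdam to Felixstowe was delayed by Maersk."
for entity in ner(text):
    # Each entity comes back with a type, the matched span, and a confidence.
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 2))
```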

Another area that is quite interesting is the extent to which a model needs to be continuously trained or not, and the extent to which it needs to be part of a human workflow or not. That's an interesting distinction, right? Is it bespoke or generic? Does it need to be adjusted and tuned on an ongoing basis, or can it, for the most part, be fire-and-forget? Are we seeing the same use case over and over again, or are we always injecting new images, new categories, into the model, so that it continuously needs training? Is it a model that can operate by itself, or one that operates hand in hand with a human operator? I think those last ones are the most interesting use cases, because they involve a human workflow, which makes them the hardest to replace.

And I think a fourth distinction is the degree of explainability that's required. There are applications where explainability is not necessary, and applications where explainability is vital, where you can't take a step without it. A black box that perfectly identifies anomalies in logistics or in a supply chain may be totally useless unless you can explain why it's an anomaly and what you propose to do about it, so that a human can sign off on a shipment and say, 'Yeah, okay, this needs to be reallocated over here.' We are years away, if ever, from companies automatically handing off their supply chains to an unexplainable AI system. But if you give someone an explainable system that says 'this is what I detected, this is what I want to do about it, just approve it, yes or no', then you suddenly have a human workflow where the enterprise can feel very comfortable with what's happening, and where the model is positioned to learn from the human. In those rare edge cases where your model is wrong, it doesn't matter, because you have a workflow: you can float those exceptions and refine the model over time. I think those are some pretty sticky use cases.
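[Editor's note: a rough sketch of that approve-or-correct pattern, with entirely hypothetical names and data. The model surfaces a finding with a reason and a proposed action, a human signs off, and every human override is captured as new labelled data for refining the model.]

```python
# A hypothetical sketch of the human-in-the-loop workflow described above.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Finding:
    item_id: str
    reason: str            # the explainable part: why the model flagged this
    proposed_action: str   # what the model suggests doing about it

@dataclass
class ReviewQueue:
    # Human overrides accumulate here as labelled examples for retraining.
    corrections: List[Tuple["Finding", Optional[str]]] = field(default_factory=list)

    def review(self, finding: Finding, approved: bool,
               human_action: Optional[str] = None) -> Optional[str]:
        if approved:
            return finding.proposed_action  # the enterprise stays in control
        # The rare edge case where the model is wrong becomes training data.
        self.corrections.append((finding, human_action))
        return human_action

queue = ReviewQueue()
f = Finding("SHP-4412", "unit cost 3.2x above 90-day average",
            "reallocate to supplier B")
print(queue.review(f, approved=True))  # -> "reallocate to supplier B"
```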
Peter Zhegin:
Yes, that's a great lens, and I was referring exactly to that one, the one that covers different types of workflow. So: native problems, where everything is digital, the Facebook and Google digital domains; then easy problems, which involve a kind of hybrid workflow, like CRMs, where there is something digital but also some offline processes; and hard problems, which involve purely offline, non-digital domains, healthcare, self-driving cars, that kind of stuff. So I was referring to the distinction between different types of domains.
Gil Dibner:
This is probably baked into what you said, but just for clarity: take problems that are hard or critical, like identifying whether a tumor is cancerous, or avoiding a pedestrian in a self-driving car scenario. Those are situations which may be hard algorithmically or computationally, and may be difficult from a data acquisition point of view: you need a lot of lung scans before you can identify lung cancer; you need a lot of pictures of pedestrians crossing in fog, wearing all kinds of different clothes, before you can reliably avoid pedestrians. But explainability may not be that important. Whereas there may be other cases, like loan approvals, or even whether Uber is going to pick you up or not, where it's very, very important to be able to explain the decisions, even though they're very simple decisions. Because you don't want to face a court case saying, 'Oh, you discriminated against me; you rejected my loan because I'm young, or because I live in a certain area, or because of my gender.' So the business criticality, or even the life-and-death criticality, of a decision or a model is distinct from the necessity of explainability. They are two different axes, and I think they're both important.
Recognising a problem to work on – domain expertise
Peter Zhegin:
So we've just demonstrated how multidimensional choosing the problem is. Based on your experience, on observing hundreds or maybe thousands of startups, how does a founder or a founding team arrive at the problem? Is it more of an analytical process or more of a creative one? How can a data scientist or a founding team actually find this problem?
Gil Dibner:
So I'm a huge believer in domain expertise. I think domain expertise should be deep, multidisciplinary and authentic. We live in an era where there are tremendous short-term benefits to successful pseudo-entrepreneurship. In other words: you have a PhD in computer science, you get your hands on a data set, you build a model that proves something interesting and predicts something interesting for one customer and one use case and one thing, and boom, a startup is born. Let's go raise millions of dollars and build the company. Sometimes that works. But because it sometimes works, a lot of people are trying it. And if you take a long view of their careers, they're missing out on a depth and authenticity of experience that I think informs the most interesting category-defining companies out there.

I mentioned this oil and gas company; that's Crux OCM. The founder herself is not a data scientist. She's a chemical engineer who came from the oil and gas world. She understands the chemistry, she understands the science, she worked in a control room, she understands operations, she understands the personas. She understands what it's like to be in a room in the middle of northern Canada with ten 50-year-old engineers who are responsible for making sure a pipeline doesn't blow up, which is a different kind of problem from optimising the flow of oil through a theoretical system. Her co-founder colleague has even more years of experience, plus deep systems engineering expertise from working at all these companies.

So there's a real depth within that domain, a multidisciplinary set of experience that really shapes things: 'Okay, we understand not just the specific technical problem; we understand all of the stakeholders who are going to use this thing, all of their needs and concerns, what their life really looks like, and all of the potential blockers to adoption.' That shapes the way that Crux is building their solution, and I think it's making their life a lot easier.
Finding a co-founder, skills of a great CEO
Peter Zhegin:
How could a potential founder find partners? You mentioned a domain expert and a co-founder. Where do really strong, well-balanced founding teams come from? Is it about people who worked together, or studied together, or some other options?
Gil Dibner:
Having worked together is obviously very helpful. Even people who worked at other companies for long enough, and were successful there, can work better with other people. Half our portfolio is Israeli; I spend as much of my time in Israel as I can. And in Israel, a lot of these technical founders usually have military experience, which provides both teamwork and domain expertise. In other words, they have dealt with these kinds of data problems and scaling problems, with an objective in mind, in a team setting, and with multiple stakeholders.

I think all of us have our biases based on our experience and our strengths. And I think the best companies, teams, founders and CEOs are those that know how to contextualise their own strengths and weaknesses in the broader context of a company and of an industry, and know how to really value, appraise and evaluate skill sets that are different from their own.

A great CEO doesn't have to be great at marketing, doesn't have to be an engineer, doesn't have to be great at data science, but should understand the difference between a great data scientist and a not-great data scientist, and the difference between a great marketing person and a so-so marketing person.

Assuming your audience is mostly data scientists: just be cognizant of the fact that you're probably going to over-index on things like data science, systems engineering, data architecture and stuff like that, and you might under-index on how much you appreciate a great product, or great marketing, messaging or sales. You're going to have to sell this thing; someone is going to have to sell this thing.

I've found, for example, one of the best shorthands for evaluating a company: just pretend you're the world's greatest account executive, the world's greatest salesperson. Would you work for this company on a commission basis? If the answer is yes, then you're probably onto something. If the answer is no, you've got to ask yourself: why don't you think you can make a lot of money? If you're getting 20% of your sales, you should be able to make a killing as a successful salesperson. It often takes technical founders a long time to get comfortable with how the rest of the company thinks and operates. And ultimately, what you're building on the technical side is sort of ammunition for your sales team to go sell something.
Building entry barriers
Peter Zhegin:
A great framework to think with. Another thing that a lot of founders, and investors as well, struggle with is entry barriers. I know you guys at Angular have done a lot of interesting work there. Can you maybe highlight how these entry barriers have evolved over the last 20 years or so?
Gil Dibner:
Yeah, I wish I remembered the list of various entry barriers that I put in a blog post. I think the fundamental point here is that technology is always cannibalising itself: stuff that was impossible 20 years ago is going to be trivial 20 years from now. So we're constantly on an eroding curve of defensibility, of the uniqueness of technology, and so on and so forth. There was a time, probably 10-plus years ago, when the idea that you could use a computer to automatically categorise radiology images as, say, malignant or non-malignant was itself sacrilegious or revolutionary. Now it's not. Now there are 20 startups a month cropping up to do various versions of that, in just that domain, and in many others.

I was looking at a company in Israel doing data labelling, and they gave me a use case: one of the restaurant chains in America was using image classification to detect mice in restaurants to comply with health standards. They had thousands of photos of mice to answer the question, 'Do we have a rodent infestation in this branch of the restaurant or not?' What I'm trying to say is that that's an example of barriers decreasing all the time.

I sort of like to go into a conversation with a founding team assuming that there are no barriers, and then see if we can come up with something that would be a barrier. If you can get proprietary data, not data itself, but proprietary data that use of your system generates, that's kind of interesting. And I think that's one of the reasons why hybrid machine-human workflows are so interesting: when humans are involved, they're generating data, if you can catch it. That's one interesting barrier.

I think system complexity in and of itself is a kind of barrier to entry. What that means is all the wrapping around the core algorithms and the core functionality: the integrations, the data handling, the chat functionality, the workflows, all of that other stuff. The sheer complexity, and a bit of execution on it, can become a barrier to entry in some cases. Those are the ones I see most often.

Sometimes it's domain expertise: knowing how to architect the system, knowing how to architect the data schema, the willingness to invest in difficult architectural decisions early, when the payoff is not immediately apparent, because you know what's going to happen. It means that by the time your competitors realise that you've done it right, they're quite a few years behind you, and even though they can catch up, you're already ahead. Those are some of the ones that we see occasionally.
How to evaluate traction – backing renegades and focusing on one thing at a time
Peter Zhegin:
You've listed quite a lot of barriers that are specifically relevant to data-science-heavy startups. When you look at a company at a very early stage, what do you expect to see? What kind of milestones, what kind of signs do you try to catch to understand the company better?
Gil Dibner:
I have not yet found a good way to answer this question; it gets asked all the time, and I don't know how to answer it. I really don't think there's an answer. But if you look practically at the investments we've made, the investments I've made over my career even before Angular, they were all over the map in terms of stage and readiness. What I mean is, we have a term sheet out right now on a company where the founders haven't quit their jobs yet and have nothing, and we have other term sheets that we've written when companies had significant revenues.

There's always something: it's either that we're moving fast, or that we're contrarian. I think our investments fit into two categories. Either the opportunity shows up in front of us and we have a prepared mind, we think we understand the space, we think we can wrap up... and we just want to get going and make it happen.

And then there are other investments we make where we think that everyone's pretty much looked at this company and overlooked it for some reason, or failed to see something that we think is true, and we're willing to make a bet that others will not. Those are the majority of what we do. That can be a contrarian bet about a founding team, or a market, or a technology, or an approach, or a go-to-market strategy.

We're not investing based on specific traction. In today's overheated market, with the formulas that VCs use and so many VCs out there, if you're trying to play the traction game, how many GitHub stars you have, how many downloads, users, revenue, customers, that becomes a pretty hopeless battle; it's very hard to be contrarian on deals like that. So what happens is you end up getting into these bidding matches: who's going to pay the most, the fastest?

And I don't think that's where VCs make money long term. I think we make money by backing renegades, radicals, outliers. The trick is, you have to bet that they won't be renegades and outliers in 18 months, that you're catching them at that inflection point where, with a little bit of guidance, a little bit of cash, a little bit of execution, they can turn that heresy into conventional wisdom within 12 to 18 months.
Peter Zhegin:
When we talk about traction and about founders at the very early stage, there is an interesting concept I found in your blog: the 'one thing', the one thing that can de-risk the next fundraising round, and the one thing that a founder needs to focus on. Can you elaborate a bit on that approach?
Gil Dibner:
Yeah. What I was trying to say in that blog post is basically a continuation of what I was saying about the idea of being contrarian.

This applies to both founders and VCs when an early-stage investment happens. It's specific to these contrarian, risky investments, less so to the investments that everybody wants to do, where everyone's tripping over themselves. Those become an access game, as opposed to a selection game.

On those contrarian deals, those contrarian investments, I find it particularly useful to try to identify the one thing that is the most significant contrarian bet that you're making, and for the VCs and the founders to be explicit about it.

In the specific example in that article, we were talking about a company that was selling into an industry that was just not accustomed to buying SaaS, and certainly not accustomed to buying SaaS quickly. The customers in that industry thought that most of their software they were going to buy as complicated, customised professional services. And these guys were saying: 'No, it's a one-size-fits-all, highly configurable, cloud-based SaaS architecture. You're going to deploy and configure it yourself in a matter of days, and you'll be up and running without any expensive professional services. And we're not going to write any custom code for you.' We knew the CEO was great, we knew the market was real, we knew the technology worked, we knew the price point was correct, we knew the market was going to be large enough to sustain significant revenue. The only thing we didn't know was whether customers would be at all willing to buy it this way, and whether this company could sell it that way. Those are two sides of the same question: can I convince you, and will you be convinced? And that was the thing. That was why we were there.

In other words, that was the reason everyone was saying this wasn't going to happen, that this wasn't a good investment. That was the thing we had an opinion about that was different from the majority of VCs who looked at this deal, and that was the thing the founder was going to be focused on. The founder, for example, didn't have to prove to me that he could run a company; that wasn't the issue. He didn't have to prove to me that he knew how to incentivize a sales team, or architect software, or whatever; that wasn't the issue. The issue was: can you get large accounts to buy this quickly? I think by now we've pretty much proven that. But the theory was that if we proved that point, then all the objections to the previous round would fall away, and you'd basically have the next round set, because people would buy that logic.
Peter Zhegin:
And was that particular issue, that particular thing, obvious to everyone? Or did founders and investors like you need to do some work, to ask many 'why' and 'what' questions, to arrive at exactly that thing?
Gil Dibner:
I think in this case it was pretty obvious, because, first of all, everyone who said no said no for this reason. We'd looked at it alongside a bunch of other investors, and all my friends were saying no for very consistent reasons. But also, having done this for such a long time, it was the question on my mind as well; it was the number one reason this wasn't going to work. I think that part was actually pretty obvious. The non-obvious part was: what are the mitigating factors? Why is this maybe not so difficult? Why is this doable?
Closing remarks – value of data science to a customer and to a startup itself, raising VC
Peter Zhegin:
Just to be conscious of time, two last questions. We've discussed quite a lot of important things. Is there any topic or theme that I didn't ask you about, but that you believe is important for founders with data science or research backgrounds? Anything that we needed to discuss but didn't?
Gil Dibner:
I think you did a pretty good job of covering most of the topics, at least from our perspective. I'm a business-oriented person, not a data scientist myself; I don't have awareness of the leading edge of specific models and so on.

The only thing I would say is about the most impressive data science teams... I guess there are two points.

One is that the most impressive data science teams that I know have a very healthy skepticism of the value of data science; they know exactly where it fits in the value chain.

The other point I would make, and I'm not sure how clearly I can make it, is that there's a very subtle but very important distinction between 1) data science where the value is something the customer gets: my prediction is more accurate, or my software is more automated, or I can identify a higher percentage of next actions, or whatever it is; and 2) data science whose function is to accelerate the sales process. And I'm a huge fan of the second.

For example, say you have a company that can automate a process, and the data underlying the thing you're automating is free text. Really good NLP models can allow you to integrate with customers a lot faster than traditional approaches would. And so you can actually sell a lot more rapidly than competitors that might not have models as good, or might not have models at all, or might need to manually tag columns in a database, or even fields, even individual entries.

[It's kind of two scenarios.] One: 'I can deploy my software, but I have to build a semantic model of your universe first.' Versus a company that says: 'Don't worry, give me your data, I'll automatically generate a semantic model, and then I'll be able to categorise all of your issues and show you that the automation works.' If you can do that, you're deploying your data scientists very directly as ammunition for your sales team. Those are the companies I find the most exciting in the data science field, the ones that are basically saying: 'How can we use data science to serve our VP of sales almost more than to serve our customers?'
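[Editor's note: as one illustration of that second scenario, here is a hedged sketch of 'give me your data and I'll generate the semantic model', using zero-shot classification from the open-source Hugging Face transformers library. The model name is a real public example; the tickets and category labels are hypothetical.]

```python
# A minimal sketch: categorising a customer's free-text records with no
# manual tagging of columns, fields, or individual entries.
from transformers import pipeline

# facebook/bart-large-mnli is a publicly available zero-shot model.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

tickets = [
    "Invoice 1182 was charged twice to the corporate card.",
    "The driver could not find the loading dock at the warehouse.",
]
labels = ["billing issue", "delivery issue", "product defect"]

for ticket in tickets:
    result = classifier(ticket, candidate_labels=labels)
    # The top-ranked label acts as the automatically generated category.
    print(result["labels"][0], "<-", ticket)
```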
Peter Zhegin:
It's a very interesting distinction, and it will be very useful for a lot of tech founders, who sometimes really overestimate the weight of the prediction side. Probably the last question for today: any advice, a single piece of advice, for someone who is about to launch a startup?
Gil Dibner:
I guess, given the current climate and the stage of a lot of these companies, the person you're describing is very early in their journey. My advice is: think about scalability and profitability, then think about scaling revenue, and then think about raising money, as opposed to starting with raising money and then thinking about those other things.

When you raise money from VCs, you're basically taking on a bunch of debt and a bunch of commitments to investors who expect certain things: growth, a certain way of running a company, certain kinds of behaviours and execution. That puts a lot of demands on founders. And there are a lot of brilliant people who can build fantastic, profitable businesses that make them very wealthy long before they have to raise venture money, if they ever do.

In today's climate, when it seems so easy to raise money and there are VCs all over the place, it's easier to raise a Series A, I think, than it is to actually build a scalable, repeatable sales model and a product that can become a category leader. So a lot of founders, I think, are going to find themselves kind of stuck with burdensome cap tables, where they could probably have gotten to a nice, profitable, happy outcome and made a lot of money for their families and themselves, without having to take on the pain of dealing with VCs like me.

Raising money from VCs, or any external investor for that matter, is really about aligning objectives. Take it seriously, approach it with caution, and make sure that everyone knows what they're getting into. I think a lot of VCs and a lot of founders these days don't know what they're getting into, and find themselves surprised by how difficult the building phase actually is, and how difficult the next round sometimes is.
Peter Zhegin:
That's amazing advice. Thank you very much, Gil. I'm sure we'll hear a lot of good news from Angular and from you.
Gil Dibner:
Can I just plug my email?
Peter Zhegin:
Absolutely.
Gil Dibner:
If anyone wants to send me a business plan or a concept or an idea, we'd love to hear about it: Gil@Angularventures.com. The more impossible or ambitious the idea, the more we'd love to hear about it.
Peter Zhegin:
Amazing. Thank you very much once again, Gil. Speak soon.
Gil Dibner:
Thank you.