

The AI engineer skills gap
Practical AI
What You'll Learn
- ✓ The data science job was once seen as the 'sexiest job of the 21st century', but the market has become much more demanding, requiring skills beyond just building models
- ✓ The job has evolved from 'my model has 95% accuracy' to owning the entire ML pipeline, including containerization, CI/CD, monitoring, and drift detection
- ✓ Generative AI has automated many entry-level tasks, leading companies to shift their hiring strategies towards proven capabilities rather than potential
- ✓ Universities are struggling to keep up, as their curricula often stop at teaching the fundamentals without providing practical, industry-relevant skills
- ✓ The new 'entry-level' jobs are effectively what used to be considered 'mid-level' engineers, creating a significant gap for recent graduates
Episode Chapters
Introduction
The podcast hosts discuss the dramatic changes in the AI and data science job market over the past decade.
The Shift in Data Science Roles
The role of data scientists has evolved from simply building models to owning the entire ML pipeline, including deployment and maintenance.
Impact of Generative AI
The rise of generative AI has further disrupted the entry-level job market, as many routine tasks can now be automated.
Changing Hiring Strategies
Companies are shifting their hiring strategies, prioritizing proven capabilities over potential and seeking mid-level engineers rather than training junior hires.
The Academia-Industry Disconnect
Universities are struggling to keep up with the industry's evolving needs, as their curricula often stop at teaching the fundamentals without providing practical, industry-relevant skills.
AI Summary
This episode discusses the significant changes in the AI and data science job market over the past decade. The role of data scientists and AI engineers has evolved from simply building models to owning the entire ML pipeline, including deployment, monitoring, and maintenance. The rise of generative AI has further disrupted the entry-level job market, as many routine tasks can now be automated. As a result, companies are shifting their hiring strategies, prioritizing proven capabilities over potential, and seeking mid-level engineers rather than training junior hires.
Key Points
1. The data science job was once seen as the 'sexiest job of the 21st century', but the market has become much more demanding, requiring skills beyond just building models
2. The job has evolved from 'my model has 95% accuracy' to owning the entire ML pipeline, including containerization, CI/CD, monitoring, and drift detection
3. Generative AI has automated many entry-level tasks, leading companies to shift their hiring strategies towards proven capabilities rather than potential
4. Universities are struggling to keep up, as their curricula often stop at teaching the fundamentals without providing practical, industry-relevant skills
5. The new 'entry-level' jobs are effectively what used to be considered 'mid-level' engineers, creating a significant gap for recent graduates
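Key point 2 above lists drift detection among the pipeline responsibilities now expected of ML engineers. The episode doesn't prescribe a specific technique; as one hypothetical illustration, a common approach is the Population Stability Index (PSI), which compares a feature's production distribution against its training distribution:

```python
import math
import random

def psi(reference, production, bins=10):
    """Population Stability Index between two numeric samples.

    Bin edges are quantiles of the reference sample; a small epsilon
    avoids log(0) for empty bins. Rough rule of thumb: PSI < 0.1 means
    little shift, PSI > 0.25 is usually treated as significant drift.
    """
    ref_sorted = sorted(reference)
    # quantile-based bin edges taken from the reference distribution
    edges = [ref_sorted[int(len(ref_sorted) * i / bins)] for i in range(1, bins)]

    def proportions(sample):
        counts = [0] * bins
        for x in sample:
            i = sum(x > e for e in edges)  # index of the bin x falls into
            counts[i] += 1
        # replace zero proportions with a tiny epsilon to keep log finite
        return [(c / len(sample)) or 1e-6 for c in counts]

    p = proportions(reference)
    q = proportions(production)
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))

random.seed(0)
train = [random.gauss(0.0, 1.0) for _ in range(5000)]    # "training" data
same = [random.gauss(0.0, 1.0) for _ in range(5000)]     # stable production
shifted = [random.gauss(0.8, 1.0) for _ in range(5000)]  # mean has drifted

print(psi(train, same) < 0.1)      # stable: expected True
print(psi(train, shifted) > 0.25)  # drift flagged: expected True
```

The thresholds (0.1 / 0.25) are conventions that vary by team; in a real pipeline a check like this would run on a schedule against logged production features, with alerts feeding back into retraining decisions.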
Topics Discussed
AI job market changes, Evolution of data scientist/AI engineer roles, Impact of generative AI on hiring, Disconnect between academia and industry, Skill gaps for recent graduates
Frequently Asked Questions
What is "The AI engineer skills gap" about?
This episode discusses the significant changes in the AI and data science job market over the past decade. The role of data scientists and AI engineers has evolved from simply building models to owning the entire ML pipeline, including deployment, monitoring, and maintenance. The rise of generative AI has further disrupted the entry-level job market, as many routine tasks can now be automated. As a result, companies are shifting their hiring strategies, prioritizing proven capabilities over potential, and seeking mid-level engineers rather than training junior hires.
What topics are discussed in this episode?
This episode covers the following topics: AI job market changes, Evolution of data scientist/AI engineer roles, Impact of generative AI on hiring, Disconnect between academia and industry, Skill gaps for recent graduates.
What is key insight #1 from this episode?
The data science job was once seen as the 'sexiest job of the 21st century', but the market has become much more demanding, requiring skills beyond just building models
What is key insight #2 from this episode?
The job has evolved from 'my model has 95% accuracy' to owning the entire ML pipeline, including containerization, CI/CD, monitoring, and drift detection
What is key insight #3 from this episode?
Generative AI has automated many entry-level tasks, leading companies to shift their hiring strategies towards proven capabilities rather than potential
What is key insight #4 from this episode?
Universities are struggling to keep up, as their curricula often stop at teaching the fundamentals without providing practical, industry-relevant skills
Who should listen to this episode?
This episode is recommended for anyone interested in AI job market changes, the evolution of data scientist and AI engineer roles, and the impact of generative AI on hiring, as well as those who want to stay updated on the latest developments in AI and technology.
Episode Description
<p>Chris and Daniel talk with returning guest, Ramin Mohammadi, about how those seeking AI engineer and data science jobs are expected to come in as mid-level engineers (not entry-level). They explore this growing gap, along with what should (or could) be done in academia to focus on real-world skills vs. theoretical knowledge. </p><p>Featuring:</p><ul><li>Ramin Mohammadi – <a href="https://www.linkedin.com/in/ramin-madi/">LinkedIn</a></li><li>Chris Benson – <a href="https://chrisbenson.com/">Website</a>, <a href="https://www.linkedin.com/in/chrisbenson">LinkedIn</a>, <a href="https://bsky.app/profile/chrisbenson.bsky.social">Bluesky</a>, <a href="https://github.com/chrisbenson">GitHub</a>, <a href="https://x.com/chrisbenson">X</a></li><li>Daniel Whitenack – <a href="https://www.datadan.io/">Website</a>, <a href="https://github.com/dwhitena">GitHub</a>, <a href="https://x.com/dwhitena">X</a></li></ul><p>Sponsors:</p><ul><li>Shopify – The commerce platform trusted by millions. From idea to checkout, Shopify gives you everything you need to launch and scale your business—no matter your level of experience. Build beautiful storefronts, market with built-in AI tools, and tap into the platform powering 10% of all U.S. eCommerce. Start your one-dollar trial at <a href="http://shopify.com/practicalai">shopify.com/practicalai</a></li></ul><p>Upcoming Events: </p><ul><li>Register for <a href="https://practicalai.fm/webinars">upcoming webinars here</a>!</li></ul>
Full Transcript
Welcome to the Practical AI Podcast, where we break down the real-world applications of artificial intelligence and how it's shaping the way we live, work, and create. Our goal is to help make AI technology practical, productive, and accessible to everyone. Whether you're a developer, business leader, or just curious about the tech behind the buzz, you're in the right place. Be sure to connect with us on LinkedIn, X, or Bluesky to stay up to date with episode drops, behind-the-scenes content, and AI insights. You can learn more at practicalai.fm. Now, on to the show.

Welcome to another episode of the Practical AI podcast. This is Daniel Whitenack. I am CEO at Prediction Guard, and I'm joined as always by my co-host, Chris Benson, who is a principal AI research engineer at Lockheed Martin. How are you doing, Chris?

Hey, doing very well today, Daniel. How's it going?

It's going really well, because I have a close friend joining us on the podcast today and a previous guest. We went through the Intel Ignite accelerator program together in different companies. And yeah, just really excited to have with us Ramin Mohammadi, who is an adjunct professor at Northeastern University and also lead principal AI engineer at iBase-T. Welcome, Ramin. It's good to see you again.

Yeah, thanks, Chris. It's always great to be back.

Yeah, yeah. I've been excited to talk through these things, even before the show. Obviously, you're kind of living in two worlds: you're living in the industry world, and you're living in the academic world. And you've been living in those two worlds for quite some time, which is interesting, because you have a perspective on how, for example, data scientists or AI people or machine learning people are being trained and what those people are actually doing in industry, which I find really intriguing, especially because so much has changed.
I guess maybe that's a good initial question: is my perception right that the role of an AI person or a data scientist or a machine learning person in industry, the day-to-day life of that person, has really changed dramatically over even the past few years? And I'm curious if the academic side has kept up with that.

Yeah, so I think that's an interesting question. I think we need to break it down into multiple sections. Let's just start first and do a quick review of what has happened, because we're talking about the complete transformation of the AI and data science job market. I mean, if you remember, it was about a decade ago, back in 2012, that Harvard Business Review called data science the sexiest job of the 21st century.

Yeah, that's why I got into it. Obviously, that described what I wanted to be.

If you think about it, that one phrase kicked off a massive gold rush. Everyone wanted it. Universities were spinning up new master's programs overnight. And the promise was pretty simple: get a degree, learn a little bit of machine learning, and you're instantly employable. That promise feels almost like a myth now. I mean, if you talk with any new graduate today, especially someone looking for their first role, the feeling is totally different. It's brutal. The market is absolutely brutal. We see job postings for entry-level roles that require about three years of experience. The demand has changed. It's shifted fundamentally. It's not about what you know from the textbook anymore. It's about what you can build; can you deploy and maintain a real, scalable AI system? That's kind of the new currency of hiring.

I think one time, Chris, I don't know if it was us that came up with this discussion, but I remember quite a while ago, we talked about kind of like full-stack data scientists or something like that. The idea being you could figure out what kind of modeling you needed to do.
You could do the prototyping and POC, but you could also deploy something to actual cloud environments or something like that. I mean, that seems like quite a tall order, Ramin, because you're basically saying, be a proficient software engineer, but also be an infrastructure person. And also, I've heard a lot of people say a true full-stack engineer doesn't really exist. So I guess, from that perspective, how much of what a data scientist or machine learning or AI person does fits into those different buckets at this point, whether it's software engineering or infrastructure work or actual knowledge of differential equations or statistics or something?

I think that's also a great point. If you think back to the data science job, the idea was that your job is kind of done once you got a good score in the notebook. You know, the classic "my model has 95% accuracy on the test data" and you're good, you pass it to someone else. And then, if you remember, around 2020, with resources like Google Cloud's MLOps guidance, it laid out this new reality: successful ML needs a whole suite of real engineering skills. Things like containerization with Docker, CI/CD pipeline automation, and monitoring. You have to know whether your model actually works in real life, and after you deploy it, you need to watch for drift. So industry made it really clear that the job wasn't just "build a model" anymore; you need to own the pipeline. And if you think about it, all of a sudden, the analysts or data scientists went from being simple analysts to engineers who build and maintain intelligent systems. And just as that engineering bar was being raised by MLOps, along comes the second, maybe even bigger tidal wave: generative AI.
And that became the explosion you saw around 2023. The Stanford AI Index basically mentioned that this was not just a cool new tool; this was an automation event. It immediately attacked the entry points into the field, because it could do those jobs, basically. And this shift was drastic, from data scientists to ML engineers and then, all of a sudden, to AI engineers, basically.

In addition to that, there's so much more diversity, as we were talking a moment ago, in the notion of the full-stack engineer, especially at the entry level, trying to fit into this. And the notion of what "full stack" is is changing fairly rapidly. There are a lot of different options out there, and not only does that entry-level student have to try to fit the notion of what an organization is looking for, but there are all these variations on that, and if they're not in the right variation of what that organization is looking for, in terms of this abundance of skills that are required for that given position, they're still out of luck. I mean, it's really a crapshoot for students today in terms of trying to find the right fit and represent their own ability to fit the organization that's looking to hire. I'm really glad that I'm not out there in the job market in that way right now. It would be brutal.

Yeah, so I think that's true. If you think about it, as this AI wave comes in with this series of automated tasks, AI made certain things simpler, and those are exactly the kinds of boilerplate tasks that you always used to give to the new hire. It's kind of like the groundwork, and for someone who's an early hire or a recent graduate, those types of jobs were the first step on the ladder. For example, you'd write a complex SQL query to get the data, write some simple Python, and get your hands dirty with the company's data; you'd learn about it and also show your skills. But now it's no longer like that.
So you need to basically find the correct fit, what exactly they want, what they want to build, and show that you can build that. And there was this study from OpenAI and the University of Pennsylvania that looked at task exposure to large language models, and the takeaway they had was pretty simple: any repeatable task that used to be given to juniors is highly vulnerable to AI, basically. So if a junior analyst used to take a whole afternoon to write the SQL queries and make the dashboard, now AI can just write it from a good prompt, right? So basically, the economic case for hiring a big group of trainees and having them do the work has evaporated. There's kind of like a change. For example, I used to hire lots of interns to basically help with the development and speed up the process, and since the AI shift, to be honest, I just use AI for all of those tasks. So this has been the big change. And of course, we are seeing this shift in hiring strategy kind of everywhere, in big tech or even in startups. They've just stopped hiring for potential and started hiring for proven capabilities. It's kind of like the paradigm has changed. Companies these days, rather than bring in 50 juniors and spend a couple of years training them, would rather hire five or maybe ten people who have already built or developed some complete system, from day one. So if you think about it, those new entry-level jobs are technically what we would have called mid-level engineers a couple of years back. The shift is really drastic. And with this new bar, it's not that you don't need knowledge. All this deep statistical knowledge, the Python skills, they're all essential. But at this point, they are kind of prerequisites. They are the ticket to the game; they are not how you win it.
You need to prove that you can build what the company wants, and then you go for the hiring.

I'm wondering, because that bar has been raised, like you say, to the kind of positions that we used to call mid-level, or maybe the entry ones now, how does that change things? Because, I mean, maybe this is a negative view that I'm about to give, but I'm very pro higher education. But I also think, whether you look at computer science or data science education, a lot of that, even before the recent shift that you talk about, didn't always connect to what you were actually going to do in your day-to-day work, right? So now, not only does it not connect to that entry-level kind of day-to-day work, but does it increase that divide even more? Like, how could we possibly train people to come in as mid-level kind of data science folks? Because I think, if I'm interpreting what you're saying correctly, it's not that AI is making data scientists or machine learning people no longer relevant. It's still very relevant. It's just that the stuff that entry-level data scientists or machine learning people used to do, and kind of level up on, is no longer available. So where are they going to do that? And is it even reasonable for us to think that universities could help get them up to that level, I guess?

Yeah, I think so. I would answer that question in two sections. One part is about where academia stands right now, and the second part would be about industry versus academia right now. So let's just start with that: where does academia stand? You know, if you think about it, and I kind of call this, I don't want to be negative, an educational bottleneck. And to be clear, the first thing is that the faculty that we have in CS, ML, and data science departments are all brilliant.
They are world-class at teaching the fundamentals: the math, theory, history, the research. That foundation is non-negotiable. You need it. But the curricula often just stop there. And it has always been kind of like that: it's all about the theory, and that leaves this huge gap between what a student learns and what employers actually need them to do on the first day. As an example, a student might spend a whole semester learning about the math and all sorts of optimization and backpropagation techniques and stuff like that, which is necessary, but as soon as they graduate, they see this job market that wants them to deploy on Kubernetes or work with all the different cloud resources. So they know exactly how the engine works, but they've never actually tried to drive the car in traffic. And there was this recent post by Andrew Ng where he argued for an urgent shift in education. I'm going to paraphrase what he said: knowledge is great, but skills are greater. Meaning that, in a field that's moving this fast, you have to teach the practical skills to get the work done. You need to give students the capacity to get meaningful work done through proper knowledge and proper training. So this is exactly what the job market is selecting for now. That's the view that I have on education at the moment. And the second part that we can talk about is a comparison between where industry is versus academia. There is a really good recent study by MIT, and the stats are staggering. They say that right now about 70% of AI PhDs are just skipping academia and going directly to industry. And that's a huge brain drain for the universities. And the second stat, which is the real killer, and I'm sure you know about this: like 96% of the major state-of-the-art systems come from industry labs, not from universities anymore.
So universities are already falling behind, and companies like Google, Meta, and OpenAI are the ones defining the frontier now. They are building the tools. They are setting the standards. And that's the absolute core of the bottleneck. Academic curricula move on a cycle of years. Getting a new course approved, updating a textbook, it's slow. By the time a university approves one new course, let's say, for example, an LLM applications course to be added to the curriculum, the tools have already changed three times. So the entire framework is really different, because it took a while. And that has happened to me also. I developed a course, and it took years to get approval to teach that course. And then you need to go back and update everything that you were planning to teach, because the industry has changed.

Well, friends, when you're building and shipping AI products at scale, there's one constant: complexity. Yes, you're wrangling models, data pipelines, deployment infrastructure. And then someone says, let's turn this into a business. Cue the chaos. That's where Shopify steps in, whether you're spinning up a storefront for your AI-powered app or launching a brand around the tools you've built. Shopify is the commerce platform trusted by millions of businesses and 10% of all U.S. e-commerce, from names like Mattel and Gymshark to founders just like you. With literally hundreds of ready-to-use templates, powerful built-in marketing tools, and AI that writes product descriptions for you, headlines, even polishes your product photography, Shopify doesn't just get you selling, it makes you look good doing it. And we love it. We use it here at Changelog. Check us out, merch.changelog.com. That's our storefront. And it handles the heavy lifting too: payments, inventory, returns, shipping, even global logistics. It's like having an ops team built into your stack to help you sell. So if you're ready to sell, you are ready for Shopify.
Sign up now for your $1 per month trial and start selling today at shopify.com/practicalai. Again, that is shopify.com/practicalai.

So Ramin, I love how you highlighted this divide between academia and industry, what that is in reality. Anecdotally, I remember, I think it was last year or maybe a year and a half ago, I lived by Purdue University. I was walking through campus, and they were just finishing this new building, right? And this was, whatever, 2024, right? And it said "Hall of Data Science", right? And my immediate thought was: in 2017, you could have created a Hall of Data Science. Now you need a Hall of AI. You're building the wrong hall. To their credit, I just looked this up while we were talking, they did rename it the Hall of Data Science and AI. So to their credit, they at least caught up with the name. But yeah, obviously you're an educator, and so you see that formal education serves a purpose and is different from on-the-job training. Have you seen examples where these sorts of practical skills are built up in an academic environment, rather than just the theory or the knowledge, as you were drawing the distinction there?

Yeah. So actually, that's something that we have been doing for almost the last three years. I developed this MLOps course at Northeastern University almost three years ago, and it has been ongoing. The idea was this: about three, four years ago, I was a hiring manager, and I used to do lots of interviews for our team. And I always interviewed smart, motivated candidates from good schools. But most of them struggled with the same thing. They understood the theory, but they couldn't build anything. They couldn't ship anything.
And that's when it clicked for me: okay, the industry, and I personally, as someone who is in both industry and academia, expect these candidates to build a real system from day one, and yet we don't teach them that. Could we do something about it? So I started working on this course. I built this MLOps course, and every semester right now we have about 150 to 170 students in one class, like a huge classroom. And instead of just learning the concepts, they start by choosing a domain that they actually care about: healthcare, finance, sports, robotics, and whatnot. Then, as a team, they spend the entire semester building one real product. And this real product is not just a homework assignment. It's not a toy example. It's a real working system with deadlines, milestones, and deliverables, just like an actual ML and software team. And the best part is how we wrap up the semester: the students present their product at our MLOps Expo, which is a full industry partner event we have been holding for the last, I think, two years now. This year, for example, we partnered with Google. So in two weeks, December 12th, we are hosting it at Google's main campus in Boston, where our students come and demo the actual product that they have built. And so the whole course is simple: you don't just learn ML anymore; we teach you how to build with it. The idea for me was to give the students the hands-on experience that companies are looking for right now. And honestly, watching the students go from "I have never deployed anything before" to "me and my team built a real product this semester", that's kind of the best part.
One hypothesis that I have here, at least, which I would love your opinion on, Ramin: on one side, you have highlighted how this gap is widening between the theory and where you need to come into a job, at a mid-level. At the same time, this revolution of Gen AI has been happening, which, in some ways, to your point, is automating some of those things, but it's also maybe enabling this younger generation of software engineers and AI people to actually perform at a higher level out of the gate, but in a different way. So there's kind of a burden on maybe us, as the prior generation of data scientists and machine learning people, to understand that students and new hires need to, from the start, be doing their data science work differently. So just by way of anecdote, we were talking about this a little bit before the show: my wife owns an e-commerce business. Black Friday, Cyber Monday just happened. Day to day in my company, I'm not doing as much hands-on work on the product as I was, given my role as CEO. But it was nice to go back. For like four days, I helped them during the sale. And I just sat in a room doing customer lifetime modeling, updating forecasts for 2026, looking at churn, analyzing the customer journey, and all this stuff. And number one, it was a ton of fun. But I was kind of coming at it from that perspective and re-entering some of those things that maybe I hadn't done as much for a little while, or even, you know, maybe since the previous year when I helped them with forecasting. I was able to get tons of that done so quickly because I was honestly having AI write most of the code for me. The thing, though, was I still had to play the data scientist to get from point A to point B.
There was no way that I could have just said to any AI system, hey, I want to write a three-sentence prompt and get out all of the lifetime modeling and forecasting and all of this stuff. I still had to play that kind of data science orchestrator and know what the things were, know what modeling techniques were relevant, know maybe what the trade-offs were, and other things. So do you think, on the one hand, it's maybe depressing that the academia-industry gap is widening, but on the other hand, am I right that there's an opportunity for these students to actually lean in, in terms of different ways of working, to get to a higher level faster?

I'm not sure about the getting to the higher level faster part, but I saw a new talk recently by Neil Ahoy over at Google, and he made a great point about this data science job. He basically was saying that the data science job is not gone, but AI is just forcing it to change dramatically. It's no longer about analyzing the data or building certain dashboards and stuff like that. As we said, now you can just prompt properly and, having the data, build that quickly. So there are certain types of tasks that you used to do to try to climb the ladder, to learn more and more, but they are not the same anymore. And the expectation is not for you to do the same tasks either, because if a company is hiring you these days, they probably want more.
But I think it is a really great point that, for hiring managers, when you hire someone on your team or have new juniors on your team, you need to also account for mentoring them properly, to be sure that they can evolve and learn. Otherwise, we basically take this cognitive ability away from them, because if you just ask everyone to build, build, build, and they just use AI, they're never going to learn how to build. So we take that cognitive ability away from them just to build new products faster.

Yeah, I think you're really onto something there. One of the things that I have done for the last few years is I'm a Capstone sponsor for Capstone projects at Georgia Tech in the College of Computing, and I'm doing that from my nonprofit role as opposed to my day job. When I work with different teams there, I think one of the challenges is they're kind of bringing what they know. Certainly, Gen AI capabilities have helped them step up a little bit along the way in terms of figuring it out. The area where I've noticed the students are still struggling, going back to Dan being a data scientist over the weekend instead of a CEO in that moment, is that he's bringing all that business knowledge, years and years and years of business knowledge and understanding about what's really needed. And I think that's one of those things that is part of the struggle at the junior level: there's the kind of concept of, I've learned tools in university, and I'm trying to bring them to bear. And they're not always the right tools for the organization they've joined. And they don't necessarily know how to combine that with all the other tie-ins that that organization may need, which were not necessarily accommodated in their academic development.
And so, you know, that's kind of exacerbated by the fact that now, with Gen AI kind of replacing a lot of those junior roles coming in, how do you ramp up? It does seem, to your point, like even though we have amazing new tools in the form of Gen AI capabilities, things are getting harder in terms of bridging that gap. And I'm not sure how you do that, because it's a combination of both the experience of being in the real world and a fast-moving technical landscape to navigate. Are you seeing that from your side with students, and how are you tackling some of those subtleties?

Yeah, actually, definitely. So two weeks ago, I sent a survey to my students and asked them to answer a couple of questions. I specifically did it for our talk. One result was that 60% of the students say that they are taking online courses on top of what they are taking in school. On another question, 82% of the students say that they're participating in hackathons in order to learn how to build quickly. And about 46% are attending workshops. So they are building their own parallel curricula through side projects, open-source contributions, or certifications through AWS or Google. And that's exactly it. The portfolio has kind of become the new credential. It's no longer about your grades; it's about what you have as a portfolio. And this is also important, kind of a dose of reality: this self-learning path isn't easy and isn't equitable. It takes tons of time and costs lots of money. If you want to practice building a real production-grade system, working with a cloud service always costs money; those are commercial services. And if you think about it, a student is already paying thousands in tuition.
They cannot also afford $400 per month for cloud computing to practice. So it's a huge change. It creates this resource divide. And at this point, I think the bar isn't just higher; it's also financially more expensive for students to learn. Right now, for example, shout out to our friends at Google: they give us lots of credits for our MLOps course every semester, because otherwise our students can't build anything in the real world. And I personally reach out to lots of providers in the industry and say, hey, you know what? We train these students to use your tools. Give us some cloud credits so they can learn and build.

Well, Ramin, I'm intrigued, because on the one side, you're thinking in an innovative way about reducing this skill gap, being creative in the academic setting to get people these skills. But you're also a practicing AI engineer. What have you seen personally? Because you're already operating at a higher level. Are there any significant changes that you've noticed in your day-to-day work over the past few years that have caused you to think about your tasks differently? More so than the entry-level folks, ways that you're fundamentally rethinking your workflows, or how you're doing the higher-level data science and AI work. I'm wondering if anything stands out for you.

Yeah, definitely. I mean, I personally have been part of this shift. I started my career as a data scientist. Then around 2018, I became an ML engineer, and it just went up from there. Then last year, I became an AI engineer. So I have been part of this chain myself.

Data, ML, AI. Exactly. Same pattern. And for me, when I look at them, they are kind of similar.
If you put data science aside, because there was no production there, it was mostly research, then when you go from ML to AI, really just the terminology is different; technically they're kind of similar. I think the main difference I personally felt is that in my day-to-day work I need to work a lot with LLMs, because it's a requirement for certain things, and work a lot with larger models, which requires you to have a better understanding of things like GPU optimization, how to break up your models, and how to ensure they're optimal. Those weren't things you were doing a couple of years ago. So I personally ended up reading a lot, spending the summer reading different books to learn, to advance my own career. And I always talk about this with my students. When I learn something new, I bring it to the class. I say, okay, I was recently reading about this, and it was really interesting; here's the link. And sometimes I give them a small lecture on it. So yes, the change is there for everyone, not just for juniors. Technically, it doesn't matter if you're a principal or a junior, but the part that I think is unfair is that juniors and recent graduates are being impacted more.

I'm curious to extend this out a little bit. We went from the challenge facing juniors, and then Dan introduced the challenge for us, people who are past that point in our lives. But fast changes are coming even more, in the sense that we're hitting the point where physical AI is really on the rise now. Not just in certain industries, as it has been historically, but in many industries. It's exploding outward at this point.
And we all have challenges in terms of incorporating these new realities into what we're doing and how we're going to learn about it. What does that imply at the university level, when you're getting back to students and you're already trying to bridge the gap into the corporate world or the startup world or wherever they're going to be productive, but you also have this explosion in the places AI is touching in new and different ways? What are the implications for the curriculum, and for the burden professors have to get their students ready for that next thing, which is steamrolling over us already?

I think it depends. I know some other schools are doing this too, but I'm going to speak with respect to Northeastern. For example, Northeastern's Khoury College of Computer Sciences, as of this year, 2026, is finally updating its curriculums. It's not going to be one big shift, but gradual. They are introducing more practical courses into the curriculums, and, for example, they are weaving ethics directly into the coding part of the curriculums. But that's going to be a slower shift on the curriculum side. On the other end, from the teaching perspective, AI is kind of a double-edged sword at this point, because students, you know, they all use AI. They are using generative AI, which is great. I tell my students: use it, but don't lose it. You need to be sure you can learn and move faster with these tools, not just give away all your autonomy and use them for everything. And then from the teacher's perspective, it's difficult, because when you give homework or labs to students, especially coding, and I'm not talking about writing an essay, from the coding perspective, you don't know.
You can't even tell whether they wrote the code or not. Everyone turns in great code these days, you know, for the homework. And there's no way for you to say whether it was written by AI or not. They are really smart about changing the temperature to make sure the result isn't detected. So again, it's kind of a double-edged sword. But also, from the other end, there's lots of information, lots of change in the market, in the industry, in the domain. Every day you read the news, there's a new article, there's something coming out. And it's hard for academia to keep up with that. Academia is falling far behind industry, and this gap is just going to keep expanding the way things are. At some point, industry needs to help academia. It shouldn't just be academia needing to keep up with industry. If industry needs new talent to come in later, it needs to step forward and say, okay, let me also help them, let me start some programs, get involved in some of the courses they have. Otherwise, it's like chasing a ball, academia just constantly trying to keep up, and that's not a race it can win.

That's fair. And I think that's a good notion that industry really needs to consider as an investment back. I agree. I think it's been largely a one-way street there.

I would like to flip the timeline around a little bit to the students who are coming in. And I'm asking this selfishly. I have a 13-year-old daughter in eighth grade. We have been applying to magnet schools and things like that and getting her ready for her high school experience. And she has never been someone interested in AI. That was dad's thing and all that. But as she has started looking at what she wants to do, she's starting to recognize that whatever that is, AI will impact her in a significant way going forward.
So it's not just the kids who are focused on technology at this point, but all of the kids. As they're entering high school, what advice do you have for what high schools need to do before those students come to you? Before you're getting those students and trying to prepare them for industry, a career, and moving through their lives, what would you like to see from high schools in terms of how they prepare these kids to be more ready to come into your care as a professor, so that you can do the thing that you do?

Yeah, I think that's a great point, and there are already shifts happening. I have been asked a similar question by neighbors: hey, should my kids still go to college for computer science? Should they study this anymore? And I think the answer is yes. There will be shifts in the market, and it's not just computer science, it's not just AI. AI is going to impact so many things, some areas more slowly, some areas much faster. At some point, all of us will somehow need to learn how to work with AI. And I think it's really good if, from high school, you understand the concepts. Maybe not the math, the theory behind AI, but just to learn, in general, how does AI work? There are lots of AI capabilities where you don't need the math behind them; you can build a system just by knowing how to put the components together. So if, from high school, they could go to workshops or participate in some sort of training and build something simple, that automatically opens lots of doors, a thinking process for the future. Then after high school, when you go to university and learn different concepts in different courses, you think, oh, I know, maybe I can build something around this.
You know, I always think that everyone can be an entrepreneur, as long as they have the right mindset and the energy for it. So if they have already been trained in high school, not in a heavy way, just an easier kind of training and teaching, they could potentially advance more in university compared to students who only start learning during university.

Well, we've talked about a lot of perspectives, both from the industry side and from the academic side. I think all of us on the call, though, are generally excited about certain parts of the ecosystem and the way they're developing. As we get closer to the end here, Ramin, as you look at the ecosystem, because again, you have multiple views of it, from the industry side and from the academic side, what's most exciting for you as you're entering this next year? Maybe it's something like, oh, I can't wait to have the time on a weekend to explore this, or maybe it's something you're already getting into.

Yeah, definitely. I actually recently purchased the Reachy Mini. The robot, right, the little desktop-type robot. Yeah. So I'm pretty excited and waiting for that to be delivered. I think the delivery is going to be early January, hopefully, fingers crossed. I'm pretty excited to work with it and build some capabilities I have in mind. And when I think about all these changes, if you had put me back a couple of years ago, I would never have gone for robotics. Oh my god, no, you know what, it's not my time. But now, with this AI change, I already went through the content on Hugging Face, and these guys are great; reading through the documentation, I was like, wow, that's pretty straightforward.
So think about how much AI has changed the field, that I can easily go buy a robot, a small robot, and plan ahead of time. They also have a simulator, so you don't need to wait for it to be delivered. You can build the apps ahead of time and check in simulation that they will work on the robot, so when the robot comes, you deploy them. That's what I'm excited for in 2026.

Yeah, it's kind of crazy. I feel like when we started in this field, it was hard enough to get the dependencies installed for TensorFlow and just be able to run any model. Just that in and of itself was...

Are you trying to give us PTSD? Is that the goal here? I mean, TensorFlow and CUDA. Yes. Yeah.

Yeah, regardless, that was the hardest problem. And now you can have a whole digital twin of a robot and do all that. It is pretty spectacular. Well, I'm also excited for that. I think we have one coming here to our offices as well, so I'm excited to see what that's like. I've never really done any robotics, other than maybe those Lego robotics sort of things. But yeah, I'm excited to see where things are going. Thanks for sharing some of your insights with us, Ramin. It's been a real pleasure, and we hope to have you on the show a third time to let us know how the robotics went.

Yeah, I appreciate that. Thanks for having me again. It was great.

All right, that's our show for this week. If you haven't checked out our website, head to practicalai.fm, and be sure to connect with us on LinkedIn, X, or Bluesky. You'll see us posting insights related to the latest AI developments, and we would love for you to join the conversation. Thanks to our partner, Prediction Guard, for providing operational support for the show. Check them out at predictionguard.com. Also, thanks to Breakmaster Cylinder for the beats, and to you for listening. That's all for now, but you'll hear from us again next week.