February 7th, 2022 · #javascript #servers #serverless
Pros + Cons of JavaScript Servers, Serverless, and Cloudflare Workers
Discussion of the pros and cons of different JavaScript server options: traditional hosting, serverless functions, and Cloudflare Workers.
Transcript
Announcer
Monday. Monday. Monday.
Announcer
Open wide, dev fans, get ready to stuff your face with JavaScript, CSS, node modules, barbecue tips, Git workflows, breakdancing, soft skills, web development, the hastiest, the craziest, the tastiest web development treats coming in hot. Here is Wes Barracuda Bos and Scott
Wes Bos
key.
Scott Tolinski
Welcome to Syntax. In this Monday Hasty Treat, we're gonna be talking about JavaScript servers, serverless, and Cloudflare Workers — some of the pros and cons of each approach to hosting and running your server-side JavaScript.
Scott Tolinski
Now, my name is Scott Tolinski. I'm a developer from Denver, Colorado, and with me, as always, is Wes Bos. Hey, everybody. Hey, Wes. This episode is sponsored by two amazing companies, one of which would be a great place to host your blog, and another one would be a great place to host your own Node server or Go server or any type of server. I'm talking about Hashnode and Linode.
Scott Tolinski
Did you realize there's two nodes? Yeah. Hashnode and Linode — and we're talking about Node today. Hashnode, Linode, talking about Node. Yeah. So, Hashnode is basically everything you need to start a blog. Hashnode.com, and it's really super cool because, well, one of the things that we always talk about is the ownership of your content, and what I love about Hashnode's approach is that they really push the ownership of your content.
Scott Tolinski
You can obviously use your own URLs to link to your own site.
Scott Tolinski
But more than just that, you actually write your posts in Markdown, and you own all of the data. It's extremely easy to export all of the data. You can have it backed up to GitHub.
Scott Tolinski
So each time you publish an article on your blog, a Markdown version of it is pushed to a GitHub repo — as in, hey, even if this blog goes away, you'll never lose that content, because it seems like you're never gonna lose access to your GitHub. Right? There's built-in analytics, built-in newsletter stuff, HTTPS by default so you're not worrying about SSL certificates, edge caching with SSL.
Scott Tolinski
So this is all really, really super neat. So give Hashnode a try, especially if you're looking to start up a blog — new year, new you. Gotta try something new. Start something different, and maybe create a blog on Hashnode. So give it a try. Let's talk about another node company, Linode.
Wes Bos
They are the cloud computing developers trust.
Linode overview
Wes Bos
Specifically today in this podcast, we're going to be talking about Linux servers to host your JavaScript, your Node environment, and Linode is a perfect place to do that. So you sign up, they're gonna give you $100 in free credit, and they have all kinds of services. The Linux servers that host Node projects are just one of the many, many different services that they offer.
Wes Bos
But this is one that I would encourage you to check out after today's episode, because we're going to be talking about the different places where you can put your JavaScript. So check it out at linode.com/syntax.
Wes Bos
Throw your next Node.js project on there — or throw any project. You can run Ruby or Go, or you can run literally anything that you can run on Linux, which is everything, on Linode. It's awesome. Check it out: linode.com/
Scott Tolinski
syntax for $100 in free credit. Thank you, Linode, for sponsoring. Sick. Okay. Cool. So this entire episode, as some of our Hasty Treats have been, started off as a potluck question, and the potluck question was one that required a deep enough answer — or could really facilitate a deep enough conversation — that we could turn it into a whole episode. So, to kick this off, we got a question into our potluck that says: I'm new to the MERN stack.
Scott Tolinski
What is the MERN stack, Wes? It's Mongo, Express, React, and Node. Node. Okay. You got it. You got it. Okay. I remember the MEAN stack. I never — I guess I did use the MERN stack. Yeah. The MEAN stack really popped because, like, what a sweet name that is. You know? Yeah. And at that point, like, every stack had a name. It was, like, really trendy to name all your stacks.
Scott Tolinski
Yeah. You know, I think you don't see that as much anymore. Yeah. Yes. Alphabet stack. Yep. I'm new to MERN app development, and I was wondering: when you use a framework like Next.js, are there trade-offs to building the API in the Next.js /api folder versus creating a separate back end outside of Next.js land? So what we're gonna be doing is using this as sort of a jumping-off point to talk about, well — typically you have node servers that you would run on, probably, a Linux box somewhere.
Scott Tolinski
Your Linodes, your DigitalOceans, those types of systems. And then you have serverless functions, which are kind of a newer way of having server-side code. And then we're also gonna be talking about Cloudflare Workers.
Wes Bos
Do you wanna kick it off? Yeah. Yeah. So this is, like, a good question, because the other day, one of the APIs that I use — it's called Recipe Puppy, and we use it in one of my courses —
Recipe Puppy API issues
Wes Bos
It went down. And I'm starting to realize I shouldn't use third-party APIs in my projects, because then that one lesson doesn't work when that thing goes down. So what I did is I said, alright, you know what, I can recreate the entire API. It's very simple. You send it a search like chicken, and it brings you back a list with an image, a list of ingredients, a description — and I think that's about it.
Wes Bos
And we just display it on the page. Very, very simple. So I was like, okay, you know what, I can make the same API.
Wes Bos
I'll just sit it on top of, like, AllRecipes or something like that, but I'll make it so that the API works exactly the same way. The only thing you have to change is the actual URL that you're hitting. So I was like, okay, what do I build this in? Do I build it in a traditional Fastify or Express or Koa — any of these, like, traditional frameworks? And I say traditional not to mean that they're old, but that's just kind of the way that we initially started building apps in Node.js, and it's still a very popular way.
Wes Bos
So I could build it in an app like that and then host it on some server and run it forever.
Wes Bos
Or I could run it on serverless functions — and we'll talk about what those are and the pros and cons of them. Or my other option was I could host it on Cloudflare Workers, which is, like, kind of another serverless option, but it's kind of its own beast.
Wes Bos
So I kind of just hemmed and hawed through them, and I thought, why don't we do a show to answer this question on the pros and cons of choosing either one? And, specifically, the question here was: if I'm building, like, a whole bunch of back-end logic, do I put it in the Next.js API folder, or do I make my own folder that has it? And the answer, at the end of the day, is most of the stuff will port from one thing to another fairly easily. The stuff that is specific to each of these frameworks — whether it's Fastify or Express on a traditionally hosted server, or whether it's Next.js or Begin or Serverless or... what are the others? There are so many different options out there for the serverless side.
Scott Tolinski
What else is there? I don't know. I always just think about AWS.
Wes Bos
Yeah. Most of them are built on top of AWS or one of the other big players — Google, Microsoft — whatever their serverless offerings are. AWS is by far the biggest player in the space. Netlify Functions.
Wes Bos
Yeah. Netlify functions are built on top of it.
Wes Bos
And in most cases, they just use the Lambda API.
Wes Bos
Sometimes they have a custom API built on top of it.
Wes Bos
And essentially, it's just you get a request, you do some stuff, and you send the result back. The getting the request and the sending the result are sometimes a little bit different from platform to platform.
Wes Bos
Like, even in traditionally hosted Express, it's response.json. Whereas with — what is it — Fastify, it's just, like, reply.send. You know, it's a little bit different. But, like, at the end of the day, most of your code is being written in functions that do things like database queries, looping over data, reducing stuff, rendering out templates, and then you send it to the thing. So I would say, like, 80% of your work is actually done in the middle, which should not change much environment to environment.
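A minimal sketch of that difference (assuming a hypothetical shared `searchRecipes()` helper — the framework glue changes, the work in the middle doesn't):

```js
const express = require('express');
const Fastify = require('fastify');
const { searchRecipes } = require('./lib/recipes'); // hypothetical shared helper

// Express: res.json()
const app = express();
app.get('/search', async (req, res) => {
  res.json(await searchRecipes(req.query.q));
});

// Fastify: reply.send() -- same shape, slightly different names
const fastify = Fastify();
fastify.get('/search', async (request, reply) => {
  reply.send(await searchRecipes(request.query.q));
});
```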
Scott Tolinski
Yeah, yeah. It's interesting, because, I mean, I run Level Up Tutorials. Our API is a separate Node application.
Scott Tolinski
But, yeah, it could just as easily be an endpoint route in SvelteKit, because you're just hitting that initial endpoint, and then you're handling it the same way. The hardest part, again, is just knowing the weird quirks and features of any individual platform you're having to deal with. I mean, I think when you have, like, a traditionally hosted server, it's really easy to just not have to worry about anything — you have an application that's up and running. But when you do get into the serverless world, there are some things to worry about, whether that is the size of the application.
Scott Tolinski
Yeah. Because if you have, like, a GraphQL endpoint, it's one endpoint that's your entire application, really.
Scott Tolinski
I think there's, like, some startup time stuff involved in that. But then again, I don't work that much with serverless, so I'm sure you can speak much more educatedly about it. Alright. Well, let's talk about pros and cons. First one: traditionally hosted Linux server.
Disk access pros and cons
Wes Bos
The pros and cons. So pro, pretty easy to use. You get a box.
Wes Bos
You generally, you SSH into it, which is kind of like logging into the terminal of the actual server.
Wes Bos
You get full control over that server. You are responsible for installing everything yourself.
Wes Bos
Often these hosts will sort of add a layer of help on top — like, a couple of years ago there was a huge thing, Heartbleed, and they would upgrade the servers for you. But they're not going to upgrade your PHP version for you. They'll give you an image that has all of these things installed — like, they'll probably have a MERN image that has MongoDB already installed, or for sure has Node.js installed and npm installed, all that good stuff. Full control over the server. You get disk access, which is a pro and a con — which is great, because if you just need to save something like an image,
Wes Bos
You can just save it to the folder.
Wes Bos
And then generally with serverless, these are, like, transient environments — meaning that with serverless, they start up, and anything written to disk, you can do it, but by the time that thing spins down, meaning that it turns off,
Wes Bos
Whatever is on there is gone. So you can't put, like, long living things on there,
Scott Tolinski
including logs. So if you wanna just have some simple logs — usually the service will give you some logging, but if you need to have more in-depth logs, you need to send that data off to a third-party logging service. Yeah. So that means, like, if you're doing any kind of caching, you'd probably want to do so in an external service, like a Redis service or something, because that caching is only going to live in the life of the memory, which, again, when that is shut down —
Wes Bos
There goes your memory. Yeah. Like, a perfect example is often people will save, like, session IDs into memory, which is not a good idea, because that will eventually run out and your server will crash. But a lot of people still do it, and they can clean it up and whatnot, and that will be gone. So that's why a lot of people use Redis, in order to save stuff that needs to be transient or shared across servers. Here's my next point: a con of this is it's a little bit harder to sort of scale up. Most of these services will allow you to increase the amount of memory or disk space or CPU or any of these things that you need — they allow you to sort of drag these sliders and get more of that. But at a certain point, it makes sense not to just have a bigger computer, but to have many mediocre computers that each handle some of the requests, and then you get into load balancers, and then it gets a little bit complicated, especially if you're trying to share things like sessions across all of the different ones. So — mhmm — that's a con. Any other cons you can think of? That's the biggest one to me: the scaling is a little bit trickier than it is with some of these other options. It just feels more complex. I think there are more
Scott Tolinski
Things you have to know about in the platform.
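Circling back to the Redis point: a hedged sketch (using the ioredis client and a hypothetical `REDIS_URL`) of parking session data in an external store so it survives any one server or function instance going away:

```js
const Redis = require('ioredis');
const redis = new Redis(process.env.REDIS_URL); // external store, not process memory

// Save a session with a 24-hour expiry so the store doesn't grow forever
async function saveSession(sessionId, data) {
  await redis.set(`session:${sessionId}`, JSON.stringify(data), 'EX', 60 * 60 * 24);
}

// Any server (or serverless function) can read the same session back
async function getSession(sessionId) {
  const raw = await redis.get(`session:${sessionId}`);
  return raw ? JSON.parse(raw) : null;
}
```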
Wes Bos
Yeah. SSH.
Wes Bos
SSH — if you wanna SSH in, sure, it's much easier than it used to be, but you still have to go in there and set up your SSL certificates. You've got to set up your own — like, I run nginx on mine.
Scott Tolinski
You do mean SSL? SSH —
Wes Bos
SS— what did I say? SSH? Oh, SSL. You're right. Thank you. I was like, wait. Okay. Yeah. Okay. So you've got to provision your own certificates, all that. Again, that stuff is much easier than it was even five years ago, but still, it's up to you to sort of take care of that stuff. And a lot of people just expect it to work, which — when you go with a service, they'll do that for you.
Wes Bos
Next one we have here is serverless functions. So, the pros of serverless — or maybe we should just explain really quickly. Serverless is that instead of having a server that handles your entire application, every single route that you hit, or every single piece of work that needs to be done — download a CSV, resize an image, query a database, bring back some JSON —
Wes Bos
all of those things are functions. Right? They do something in response to a request — they do something and return a result.
Serverless functions overview
Wes Bos
And the whole idea with serverless functions is that you only pay for what you need. So if you need to make a CSV out of a database:
Wes Bos
someone hits that URL, you query the database, you do your work, you make a CSV, and then you send it back to the user. And then, as far as the serverless function is concerned, it's done. It says: I'm done. You might need me in 3 seconds, you might need me in 3 years, but I'm going to sleep. Right? And that's not entirely true, because they do stay warm for 15 minutes and whatnot, but that's kind of the idea. They got a blanket on. Yeah, exactly. They put a blanket on. It's like, you know what, I'm gonna hang around for a couple minutes in case you need me again.
Wes Bos
But then also, on the flip side, it's: oh my gosh,
Wes Bos
10,000 people need CSVs right now.
Wes Bos
Holy crap. Let me get some of my friends in here. Hey.
Wes Bos
Like, let's say, my website and Scott's website were both hosted on serverless functions.
Wes Bos
And let's say my website got really busy, but Scott's website wasn't doing much. It'd be like, hey — Wes's website could use a little bit of help. And Scott's little serverless functions would run over and say, hey, I'm happy to help out. Let me do a little bit of that work.
Wes Bos
Everyone would do the work, and they'd go, oh wow, that was a lot of CSVs I had to generate. And then they'd all put the blanket on and eventually fall asleep. Right? Does that make sense? Sounds right to me. I like the metaphor.
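To ground the CSV example, here is a hedged sketch of an AWS Lambda-style handler (the `queryReportRows` helper is hypothetical): it wakes up for the request, builds the CSV, returns it, and goes back to sleep.

```js
const { queryReportRows } = require('./lib/reports'); // hypothetical database helper

// API Gateway-style Lambda handler: one request in, one response out
exports.handler = async (event) => {
  const report = event.queryStringParameters?.report || 'orders';
  const rows = await queryReportRows(report);

  const header = Object.keys(rows[0] || {}).join(',');
  const body = rows.map((row) => Object.values(row).join(',')).join('\n');

  return {
    statusCode: 200,
    headers: { 'Content-Type': 'text/csv' },
    body: `${header}\n${body}`,
  };
};
```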
Wes Bos
You're only paying for the time that they are actually doing work.
Wes Bos
It can be very cheap to run stuff. And this is a good example of that little API.
Wes Bos
I didn't really want to pay $5 for the rest of my life to host that API.
Wes Bos
With serverless functions — it's not going to be that popular, it's very, very simple — it's going to be very, very cheap, if not free, to run this thing indefinitely for the rest of my life. What else? It's pretty easy to move over from other serverless stuff.
Wes Bos
The cons of that: it requires you to rethink how you do some things. So you don't really start your app — that's a big one for people. They're like, okay, I got my app on Vercel or Netlify —
Wes Bos
how do I start it? Right? There's no starting it, because it starts itself when a request comes in and it stops itself when it's done. Right? I always wondered that. So, okay — like I said, I don't do much serverless. So you're connecting to a
Scott Tolinski
database, right? You would have to re-establish — or establish — a connection to that database.
Scott Tolinski
When the request comes in, that's what happens. It's like: request comes in, I establish a connection to the database, then I do the work that's in the database, then I disconnect. Yeah. And if you think about, like, a traditional database, like MongoDB —
Wes Bos
there are some approaches that people take — it's called connection pooling — where you sort of keep a connection live and then you share that. I haven't specifically done it with MongoDB, so I don't know if I can talk super well to it, but that's the approach, because, yeah, it takes, like, a couple milliseconds to connect to the database, and some of these serverless functions expect you to do everything in 50 milliseconds or less. Right? Yeah. And you can't go over that in some cases, so it's not a good fit for that.
Wes Bos
And that's why sometimes people will go for an entirely new database approach as well when they move over to serverless. Not to say it can't be done with MongoDB — certainly, even MongoDB themselves has a serverless offering.
Wes Bos
It just takes a little bit of a different approach to handling that.
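A common pattern for that (a hedged sketch — Wes says he hasn't done it with MongoDB specifically) is caching the client in module scope so a warm invocation skips the connection handshake:

```js
const { MongoClient } = require('mongodb');

let cachedClient = null; // lives outside the handler, survives warm invocations

async function getClient() {
  if (!cachedClient) {
    cachedClient = await MongoClient.connect(process.env.MONGODB_URI);
  }
  return cachedClient;
}

exports.handler = async () => {
  const client = await getClient(); // reconnects only on a cold start
  const recipes = await client.db('recipes').collection('recipes').find().limit(10).toArray();
  return { statusCode: 200, body: JSON.stringify(recipes) };
};
```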
Wes Bos
What else? No long-running processes. So if you wanted to have a process that was running for 16 days or whatever —
No long running processes
Wes Bos
these processes, again, start up when they're requested and they shut down when they're done. So if you have a long-running process — like, an example would be responding to event listeners on the Twitter firehose API, right? — that's not a good use case for serverless functions, because you would need to constantly be listening for something.
Wes Bos
But, like, a webhook would be a good use case for serverless functions, because it's specifically pinged when somebody hits that URL.
Wes Bos
Some packages don't work in a serverless environment because they require, like, native low-level bindings.
Wes Bos
Like, a big one for a long time was PhantomJS.
Wes Bos
And there are ways to get it to work, but a lot of the time these packages that just work on Node don't necessarily work 100% in a serverless environment, because they're trying to keep a lean environment, and they don't give you access to an entire Linux server when you're doing this.
Wes Bos
And the last one we have here is some packages won't work because they are simply too large.
Wes Bos
So if you think about it — essentially, what a serverless function does is it tries to zip up all of the stuff you need. And you can actually run an entire Express server in a serverless function.
Wes Bos
But at a certain point, you're going to have too many dependencies for that entire website to run, and it will be too large to fit into a serverless function.
Wes Bos
So the idea is that you sort of break up every route or every function into its own serverless function, and it will only include the things that it needs to work. Yeah. Actually, it's funny — SvelteKit just got server-side code splitting for that purpose, for that explicit purpose. Oh, yeah. That's really handy, because some of them make you explicitly say what packages you need for every single serverless function.
Wes Bos
So, like, I don't know. You've got, like, a large website with 60 routes.
Wes Bos
You've got to have 60 package.jsons. That sounds like a nightmare to me.
Wes Bos
So a lot of them will do tree shaking, where they're just like, oh, I can figure out what you need, and we won't bundle the rest of it. Totally.
Wes Bos
So, third one: Cloudflare Workers, which is sort of a beast of its own. I've used it, I think, three times now. Have you dipped your toes into this yet, Scott? I have done no toe dipping into Cloudflare Workers — no toe dipping — or serverless functions. It's wild to me that I haven't done either of these, but, yeah —
Scott Tolinski
from my understanding — let me give a little dumb-guy outlook on this. So Cloudflare, their Workers are essentially something that is — let me try to get this and then you can correct me — is it a serverless type of function that just runs closer to the edge, essentially? Yeah.
Wes Bos
So, you hear it — maybe we should do an entire show on what the edge is. I know. I know. Right? Because you hear that a lot. And, yeah, so that's part of it, in that it will actually run on the actual server that is closest to the user. But Cloudflare has its own version of JavaScript, which was wild to me when I found that out. So Cloudflare Workers don't run Node.
Cloudflare Workers use own JS
Wes Bos
They run their own version of JavaScript. I'm not sure what engine it is, but they have been developing this over the years.
Wes Bos
And it's okay, because it's just JavaScript — but it's not Node.js at the end of the day.
Wes Bos
The API for it is almost exactly the same as web workers. So if you were to imagine that you are running something in the browser and you send off something to happen in a web worker — web workers are a bit weird because, like, you're not in the browser.
Wes Bos
You don't have access to the DOM or any of the browser APIs, but you also don't have access to any of the Node.js APIs. You just have access to JavaScript APIs, and then an additional set of web worker APIs that help you do things inside of that web worker.
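A minimal Worker looks roughly like this (a hedged sketch using the classic service-worker-style syntax): no Express, no Node APIs, just a fetch event with the standard Request and Response objects.

```js
// The Worker responds to 'fetch' events, much like a service worker in the browser
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  const { searchParams } = new URL(request.url);
  const term = searchParams.get('q') || 'chicken';

  // No fs, no process, no Node globals -- just web-standard APIs
  return new Response(JSON.stringify({ query: term, results: [] }), {
    headers: { 'Content-Type': 'application/json' },
  });
}
```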
Wes Bos
So I'll go through some of the cons of it because they have most of the pros of serverless as well.
Wes Bos
One of the weirdest things to me — and this is the thing that tripped me up the most — is that you're allowed to use npm: you npm install your dependencies and you can use your bundlers and all that stuff that you want.
Wes Bos
But any package that has a dependency on Node.js APIs — so anything like fs or any of the performance APIs, or... there's a whole bunch of them —
Wes Bos
they sometimes will work and sometimes will not work.
Wes Bos
And Cloudflare put out this, like, website called Works With Workers, where they say there are 20,000 packages on npm that work with Cloudflare Workers.
Wes Bos
Here's 55 of them. They're just like 55 random packages.
Wes Bos
And you kind of look at it like — so, like, almost nothing. Like, if you search for any package you would hope to work,
Wes Bos
it's probably not on there, because it's 55 random packages. So I don't think that was a very good, like, marketing choice on their end to launch that website.
Scott Tolinski
Some of that stuff scares me. I mean, that kind of thing is directly the type of thing that would scare me, because I would say,
Wes Bos
do I wanna get into this? So this is it? They say 20,000, but they only list 55. So, like, what are the other — right, yeah —
Wes Bos
19,945?
Wes Bos
So how do you know? You don't, until you're frustrated because you get a random, cryptic error message.
JSDOM doesn't work in Cloudflare
Wes Bos
And it used to be that you couldn't even run your Workers locally, because it runs in its own environment. So they're coming along — they made this thing called Miniflare, which is pretty sweet, and it runs on your computer and replicates the Cloudflare Workers environment. But, you know, that's always a bit of a bummer, when you try to replicate the environment as close as possible and then you get into those, like, "well, it works on my machine" things. But when I deploy it, after I wait six minutes for it to build, it doesn't. So that's a bit of a bummer as well.
Wes Bos
I specifically ran into this with that — so, the Recipe Puppy API that I made: I was using JSDOM just to scrape a page, pull the recipes off the page, and render it out as an API. I mean, I was using JSDOM for that because I quickly wrote it in Node and it worked. I'm like, okay, good. Now let's connect it to an API — let's try Cloudflare Workers. And then it didn't work. And then I found out it's because JSDOM doesn't work in Cloudflare Workers, because it requires proprietary Node APIs.
Wes Bos
So I found this one called linkedom, which is meant to run in either web workers or Deno — it works in those as well — or, in my case, it worked on Cloudflare Workers. So I was happy about that. But I feel like if you decide to go all in on Cloudflare Workers, you'll hit a lot of those initial bumps and scrapes in terms of picking your tech. But for a lot of people, that is well worth it. It's the same thing as the early days of Lambda — like, serverless functions on AWS — there were a lot of bumps, but those limitations are nothing compared to the price and performance that people get in return for running their app on Cloudflare Workers.
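For reference, a sketch of that swap inside a Worker might look like this (the recipe-site URL and the selectors are hypothetical; linkedom's `parseHTML` is the real entry point, and it works without Node-only APIs):

```js
import { parseHTML } from 'linkedom';

// Hypothetical scraper: fetch a recipe search page and shape it into JSON
async function scrapeRecipes(term) {
  const res = await fetch(`https://example-recipes.com/search?q=${encodeURIComponent(term)}`);
  const { document } = parseHTML(await res.text()); // plain JS DOM, no jsdom required

  return [...document.querySelectorAll('.recipe-card')].map((card) => ({
    title: card.querySelector('h2')?.textContent.trim(),
    image: card.querySelector('img')?.getAttribute('src'),
    ingredients: [...card.querySelectorAll('.ingredient')].map((li) => li.textContent.trim()),
  }));
}
```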
Scott Tolinski
Interesting. So, okay. Now that we have, I guess, the whole baseline here —
Scott Tolinski
Mhmm. Just so we don't go over a crazy amount of time: when we're thinking about this stuff as it pertains to the initial question — like, should I put this in my API folder, or should I run it on my own? And if I do put it in my API folder, what essentially does that mean in terms of your use case? And for me, if I was wanting to get into serverless functions or Cloudflare Workers, using something like Next.js or SvelteKit or any of these things where they have that stuff kind of obfuscated for you seems like the way to get into it.
Scott Tolinski
I'm wondering, even as — the same as this user here — should I move my API into a Cloudflare Worker setup or something like that within SvelteKit, or should I keep it as a separate API? Like, what is your call there, Wes, if you were making that call?
Wes Bos
I think that, regardless of which approach you take, 80 or 90% of the meat of your application — what it does — should be in a lib folder somewhere.
Separate code into lib folder
Wes Bos
It's just functions that do things.
Wes Bos
And then the remaining 10 or 20%, which is how you take the request and send the response, needs to live inside your API folder, because that's how you actually handle the response.
Wes Bos
So it's not about putting 100% of your code inside of the request handler — it's: do 80% of it outside. And there are other benefits to that: it makes it more testable, it makes it more reusable.
Wes Bos
You can move from one to another pretty easily.
Wes Bos
So there's that as well.
Wes Bos
So what was the question? What should you do? What should you do? So I think that — and this is how I approach a lot of the things that I'm building quickly — is just, yeah, write a lot of your code separately and then adapt it to the request and the response that you have there. And that will make you much happier, because when you realize, you know what, this is not going to work in Workers, or, oh, I wrote this for an Express server but I now want to host it on Vercel because it's going to be serverless —
Wes Bos
The move from one to another is not going to be as bumpy as it might have been previously.
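A hedged sketch of that split (the file names and the `searchRecipes` helper are hypothetical): the lib function owns the logic, and each platform only gets a thin request/response wrapper.

```js
// lib/recipes.js -- the 80-90%: a plain function that does the work
export async function searchRecipes(term) {
  // ...query a database or scrape a page, then shape the results...
  return [{ title: `${term} soup`, ingredients: ['...'], thumbnail: '...' }];
}

// pages/api/search.js -- Next.js API route: thin glue around the lib function
import { searchRecipes } from '../../lib/recipes';

export default async function handler(req, res) {
  res.json(await searchRecipes(req.query.q));
}

// worker.js -- the same lib function behind a Cloudflare Worker instead
import { searchRecipes } from './lib/recipes';

addEventListener('fetch', (event) => {
  const { searchParams } = new URL(event.request.url);
  event.respondWith(
    searchRecipes(searchParams.get('q')).then(
      (results) =>
        new Response(JSON.stringify(results), {
          headers: { 'Content-Type': 'application/json' },
        })
    )
  );
});
```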
Scott Tolinski
Interesting. Yeah. Cool. I mean, I agree. There are so many, like, niceties now that exist to avoid having to get into the difficulty of these things, whether that is using Begin or Architect or any of these modern systems that kinda hide all of the complexities of these things. So, I don't know. Yeah, it's a tough call for me. I should maybe just, like, dive into some of these things a little bit more and gain some experience there. I'll also say one more thing is that
Wes Bos
the benefit to going serverless first is that it's very easy to host a lot of these projects. Often, you can find places to do it for free.
Wes Bos
And if you're just goofing around, you don't necessarily have to worry about: should I keep this thing online? Are people even using it? Am I paying to host this? You know, like, all of that stuff. It's a little bit easier, I think, in serverless function land, especially if you're starting from scratch and you're not trying to port something over. So there's that as well. But, that said, Scott and I both host our course platforms on the first option, which is traditionally hosted Linux servers, and —
Scott Tolinski
I've had zero issues doing that. Yeah, that's the whole thing, right? It's — you know what, I do like that. It's also very, very cheap to do it. Like, I host my entire business on a couple hundred dollars a year. Yeah? Yeah, totally. Especially modern servers. Like, we're on render.com, and that is very cheap for one single container of a Node site. You up it just a little bit, and you're still getting really good performance for a couple hundred dollars a year. And, yeah, that Node server for me — one of the cool things too with that, that we maybe didn't touch on, is that it's easier to version and release new updates just for the API if it's in its own self-contained thing. Maybe not necessarily traditional versus serverless, but, like, the API being in the Next.js site versus having its own thing. It now becomes something else you have to manage, but you can manage it, and it gives you a little bit more control there. So —
Wes Bos
Oh, trade-offs. Trade-offs. Trade-offs. Trade-offs. So hopefully that answers your question of what you should do. The answer is: it depends, as always. And I think that's it. Yeah. Thanks for tuning in. Peace. Peace.
Scott Tolinski
Head on over to syntax.fm for a full archive of all of our shows.
Scott Tolinski
And don't forget to subscribe in your podcast player or drop a review if you like this show.