Gert Franz talks about “Cool Lucee CFML (GigaBytes file parsing and more)” in this episode of the CF Alive Podcast, with host Michaela Light.
Read the show notes and download the full episode here
Michaela Light 0:01
Welcome back to the show. I'm here with a good friend, all the way from Switzerland, and we'll talk about cool new stuff in Lucee CFML, including gigabyte file parsing, asynchronous logging to the database, mail listeners, query listeners, all kinds of cool things you can do in Lucee these days, and catch up with what he's been up to. So welcome, Gert.
Gert Franz 0:38
Hello, and thank you for having me back. You're so right, yes.
Michaela Light 0:42
It's been nearly a year; it's far too long. I mean, so many things. So if you don't know who Gert is, he's heavily involved in the Lucee open source CFML project. And since he was an astrophysicist, he just decided he must do something more complicated and decided to start writing a ColdFusion application server.
Gert Franz 1:07
Well, just to correct you: I was never an astrophysicist, even though I studied that.
I never worked a single day as an astrophysicist. Only maybe in my free time, when I pull out my telescope. Maybe I'll go back to those days. But I'm a programmer by heart, and from the bottom of my heart, since the last day I was in uni, or even before that, since I was fifteen years old, when I started with BASIC. I remember even the first day that I saw a computer program: that was in '85... '84. It was in '84. I remember the day because I was watching over the shoulder of a guy, and he was showing me something complex and what it was actually doing, and I was flabbergasted. And the first thing I remember, in '84, that I was able to do: I managed to get our text printer to print out graphics. Real graphics, not just wild characters. What you had to do is modulate the characters and kind of plot them there. And I was able to do some kind of charts with that. And I was very, very proud, being 16 or 17 years of age. And that was on a Schneider.
Michaela Light 2:25
That was only a few years ago. You're now 21, right?
Gert Franz 2:29
I'm 21, yes. You know, I hope I never grow up.
Michaela Light 2:31
In here, right?
Gert Franz 2:35
In the meantime, I'm 34, right? So that is 34... in hex. But still, in my head, I'm 15 hex. Right? Is it 15? I guess, 21.
Michaela Light 2:51
Yeah, so you're in charge of the development department at DistroKid, which is, you know, a music distribution company. You're a cool kid, because not only do you program in Lucee, but you also work at a music company.
Gert Franz 3:07
Well, you know, it's always so funny when you're in meetings with these cool kids from the creative department. They sometimes sit in the meetings with their guitars, and when they're on mute, you see that they're playing and even maybe singing while listening to whatever we're saying. So that is really very cool. And while they're talking, I'm just typing a cool program or whatever, which is not as cool as playing or drawing something. But it is a lot cooler if somebody asks me, well, where are you working now? Well, I work in the music industry. That sounds a lot cooler, right? But still, we're programming. We're programming fully with Lucee, and I really love this job. It's hopefully the last job that I'll ever have, and I haven't said that about any job before, right? And I still consider it my dream job after two years of being at DistroKid now, and I took the team from being two guys to now 16 developers.
Michaela Light 4:10
Gert Franz 4:13
Great developers, and many of them are very, very well known. Mark Drew, for one, Matt Gifford, and several others. And Micha, obviously, and Igal: Micha being one of the core developers of Lucee, and Igal one of the core developers as well. I can tell you honestly, it's not always easy, because, you know, it's herding cats. These developers are really very, very great developers; I mean, nearly every one of them has written a book, right? I love just being able to work with these guys. But sometimes it's like that when you deal with people that are right in 99% of the cases. And now they're discussing an issue: which of them is right? Thank God we didn't end up in a fight yet. That's still to come. But what is great: nobody is the smartest in the room. Everyone is smarter than you on a particular subject. Maybe you know something about certain things, but you can always learn something, and that is really, really awesome. That is what I love about this job so much: I'm not at the end of the line. I can still learn stuff, a lot of stuff. And our CTO, who is not really a guy with a Lucee background, decided that we're going to go full stack Lucee. We had nginx, and we had Node.js and PHP in the mix, but now we're slowly phasing them out, which is great, and it's going to be fun. Lucee from A to Z.
And wow, this is
really cool, because now everyone can fix any part of the whole system. We're using things like Bitbucket and pipelines, and using Lucee CFML with Ortus CommandBox in the pipeline deployment scripts, which is very great. Thanks and kudos to Mark, who set that up. Every deploy is one click away. And we're doing test-driven development, and no feature ever goes out untested, things like that. And DistroKid itself is the largest music distributor in the world, so more than 50% of the music goes, I suppose, over Lucee servers, right? That is really very, very impressive, and I'm very, very happy and very proud of being part of this cool team. Especially because it's very chilled and very relaxed. There's no pressure; there are no egos in the mix. Well, sometimes, but I mean, this is what happens when you deal with such famous people. But having the opportunity of using Lucee and its full power at DistroKid is really something very relaxing, even, I would say. Because it shows me the full potential of Lucee, even under pressure: I don't know, sometimes we have even a couple of hundred requests per second on the servers, and the servers don't really choke on this. Everything runs on Amazon AWS machines, and we have auto scaling in place, which is, of course...
Michaela Light 7:49
How many servers are in the cluster?
Gert Franz 7:54
Something between 10 and 40, or whatever; it depends on how much is going on in the system.
Michaela Light 8:03
So it also scales up to more servers when needed? But you usually don't need that many, because Lucee is just so performant, right?
Gert Franz 8:14
Right. And if you have a slow website, you can't just say your shop is big because you have 40 servers running. You know, for me, a page that is slower than, say, two or three seconds is not worthy, performance-wise. And maybe, as you know from my past, I am the guy who does most of the performance tuning, especially in Lucee and databases.
It is a little frustrating, because, you know, if the pages are fast already, what do you want to tune? There are some pages that we need to improve a lot. Yes, we're working on those, but those are mostly larger changes that we need to make to the system. I've done a couple of performance talks in my past, and one of the most important things about performance tuning is this. Let's assume that everything you have done is covered by the usual performance recommendations: you're not making the usual mistakes of using SELECT *, or having the wrong indexes, or using too-large lists, and what have you. Once you have done that, that's when the interesting part starts. And the one thing that I took away from the last 25 years of doing performance tuning is: you need to know your data. Once you know your data, you know exactly how to performance-tune certain things. So, for example, one of the things that we're doing is we're parsing large files, which contain information about song usage, like for stats, etc. Right?
Michaela Light 10:07
When you say large, what do you mean? A megabyte or...
Gert Franz 10:11
Michaela Light 10:12
Gigabytes? Gigabytes, yes. Wow. 20 gigabytes?
Gert Franz 10:16
Yeah, ish. Yes, it depends on what we're querying there, of course. And there it is important... So, for example, if some provider of data is sending you data from his system, the data in his system is mostly mapped to his internal IDs, right? There might be some global IDs there. For example, in music, it's the so-called International Standard Recording Code, the ISRC; that is how you identify a song as being that song, right? And this ISRC is global and unique per song, or it should be unique, and it is maybe the only thing that is identical in the data that you get from a provider versus your own data. Now, if you want to translate that, which is a character string, and you don't want to have primary keys as characters, depending of course on the size: if you want to map those, what you have to do is maybe read everything into a table, and then do an inner join, and try to figure out which ISRCs match your own ISRCs in your local database. And that can take ages. Really, ages. So one of the things that I did in Lucee is, I used a feature of queries where you can return a query as a struct. Not as a query, but as a struct: you do a cfquery, and then you use the attribute returntype equals struct. So what I do is I read all the 4 million ISRCs that we have, and our internal IDs, into a Lucee struct, and whenever I read the file from the external provider, I can just do a simple struct lookup: give me that ISRC, and then I have all the IDs in there. In the end, this proves to be a lot faster than doing that on the database side, because there you need to write everything to the database first, and then you need to do the updates on millions of records. And that can sometimes be very, very slow.
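A minimal sketch of the struct-keyed lookup Gert describes. The table and column names are invented for illustration; the Lucee-specific piece is the `returntype` option on `queryExecute` with a `columnkey`:

```cfml
// Build an in-memory lookup once: ISRC -> internal ID.
// returntype "struct" keys the result by the given column (Lucee option).
// Table and column names here are assumptions, not DistroKid's schema.
lookup = queryExecute(
    "SELECT isrc, song_id FROM songs",
    {},
    { returntype: "struct", columnkey: "isrc" }
);

// Later, while parsing the provider file, resolve each line in O(1):
isrc = "USRC17607839"; // example ISRC from an incoming line
if ( structKeyExists( lookup, isrc ) ) {
    internalId = lookup[ isrc ].song_id;
}
```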
Michaela Light 12:25
Does that take up a lot of memory, though, on the server?
Gert Franz 12:30
The struct? Absolutely, it does. So that's why what I do is I only take the most popular 500,000, which cover usually about 90% of the data, right? Not all songs are equally popular. So you take the most popular 500,000 songs, and that gets you through 90% of the data, and for the rest, the remaining 10%, you can do the lookups in the database. And that is not that tragic, because otherwise you would use a lot of memory for the last 10% of the data. Sometimes you have one ISRC being referenced once, in one line, and I wouldn't want to use memory for that one. But if you have a very popular song, you might have 50,000 lines with that same ISRC. So having that one in memory already covers 50,000 rows in the file that you're reading in. I see.
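The popularity-based split could be sketched as a hypothetical helper like this, assuming a cache struct populated at startup with the most popular ISRCs (cache name and schema are invented):

```cfml
// Hot path: struct lookup covers ~90% of lines.
// Cold path: rare ISRCs fall back to an individual database lookup.
function resolveIsrc( required string isrc ) {
    if ( structKeyExists( application.isrcCache, arguments.isrc ) ) {
        return application.isrcCache[ arguments.isrc ].song_id;
    }
    // Rare ISRC: look it up in the database instead of caching it.
    local.q = queryExecute(
        "SELECT song_id FROM songs WHERE isrc = :isrc",
        { isrc: arguments.isrc }
    );
    return local.q.recordCount ? local.q.song_id : "";
}
```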
Michaela Light 13:31
So you're optimizing the code by, you know, the most frequently accessed stuff goes to memory, and if that fails, it goes off to the database to do the lookup.
Gert Franz 13:41
Yes, but for that you need to know your data, right? You cannot just generalize performance tuning. If you really want to do specific things, then you need to know your data; then you know exactly what to do in order to get the last ounce out of it. But just to give you an example: the first iteration that we had, which I think was never live, took, in order to read, I don't know, 20 gigs of file into the database and update everything, about, no lying, two weeks. All right? We have a huge database, so don't get me wrong. And after we had performance-tuned it, after it came to my desk and I was looking through it, the end result was two hours, right? So from two weeks to two hours: it's quite an improvement. And there are...
Michaela Light 14:36
A ColdFusion template that took two weeks to run?
Unknown Speaker 14:39
So yes, yes,
Michaela Light 14:41
That's quite a long-running template. Yeah, but it's going through a couple of steps. So what you do is...
Gert Franz 14:48
Run this; when it's done, run that; when it's done, run that. So it's not one template that does it. There were several.
Michaela Light 14:57
How does that work? How does that work if the server has a crash in the middle of that two weeks?
Gert Franz 15:03
Well, that's why it's great that Lucee has something called tasks. Lucee tasks are things like: when you send out an email with Lucee, you can just write a cfmail, and if you're using the task feature in Lucee, then this is written to a file, and that file gets executed. Now, if that execution fails, it will retry after a certain amount of time, right? This is what Lucee tasks are for. Even if the server crashes and comes back up, that file is still there. Once the file has been executed, it's going to be deleted, right? Obviously, if the file system goes haywire, then you have an issue, but that is quite rare, a very rare happening. What you could do then is, for example, move the Lucee file system into Redis, or into a distributed memcached, or a data grid; I'm just giving an example here. But there are ways around this, so that these things don't crash. But saying that it takes two weeks was just an example of where we started. In the end we ended up with two hours in one template, right, with everything written in an object-oriented approach. Which is sometimes important if you want to really squeeze out every single ounce of whatever you want to do. Sometimes, what I've done, and this was a very neat trick: I'm doing this ingestion of large files for different file formats. So what I have is a general CFC, which does the heavy lifting, and several specific CFCs inherit from that main one; the specific things happen in the inheriting classes. Now, in the large class, I had several calls to methods that are empty in the master class, because most of the specific classes didn't need them; they were only needed for certain events.
So I was calling them even though they did nothing, just to cover the fact that in a specific case I needed one of these functions. What I did then is, in the init method, I read all the methods of my specific class, found out which ones are there, and converted each call into an if: if that method is there, call it; if not, skip it. And just converting that method call, which is a local method call in the same CFC, into an if sped up the code by a lot, right? So it took maybe, let's say, two hours and 15 minutes, and now it takes two hours. Fifteen minutes doesn't sound like a lot, but if you continue at that pace, you end up maybe even having negative time.
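The init-time trick Gert describes might be sketched like this: detect at startup which hook methods the concrete subclass actually implements, and guard each call with a plain if. The hook names here are invented for the sketch:

```cfml
component {

    function init() {
        // structKeyExists( this, ... ) is true only if the subclass
        // actually defines the public method.
        variables.hasOnHeader = structKeyExists( this, "onHeader" );
        variables.hasOnFooter = structKeyExists( this, "onFooter" );
        return this;
    }

    function processFile() {
        // A plain if is cheaper than calling an empty method
        // thousands of times per file.
        if ( variables.hasOnHeader ) {
            this.onHeader();
        }
        // ... heavy per-line work here ...
        if ( variables.hasOnFooter ) {
            this.onFooter();
        }
    }
}
```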
Well, you get it back as soon...
Michaela Light 18:19
as it starts?
Gert Franz 18:22
Yeah, before you click, before you hit the Return button, the result is already there, right?
Michaela Light 18:28
I'm kind of curious how Lucee does the CF looping on a file. Is it sucking the whole file into memory, or is it bringing in one line at a time?
Gert Franz 18:41
So, what you can do in Lucee, and that notation is available as well: you can do cfloop file equals, and then you point to a text file or a CSV file or whatever. The loop goes line by line, and it supports index and item; it has supported that for a while already. Index contains the line number, and item contains the line. And then usually what you would do is something like listToArray, right? You have a separator, which is pipe or tab or semicolon or comma, and then you just get an array of values, and you do with those values whatever you want to do. And you can do a start line and an end line and stuff like that, which works quite nicely. But looping through a 20 gigabyte file is quite time-consuming. Just running through the file with cfloop file takes up to, I would say, two minutes: just looping through that file, moving the pointer on the file.
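The line-by-line loop Gert describes looks roughly like this; the file name and pipe delimiter are assumptions:

```cfml
<!--- Line-by-line loop: only one line is held in memory per iteration.
      index = line number, item = line content. --->
<cfloop file="#expandPath('usage-report.csv')#" index="lineNo" item="line">
    <cfset values = listToArray( line, "|" )>
    <!--- values[1], values[2], ... hold the columns of this line --->
</cfloop>
```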
Michaela Light 19:51
If it's a 20 gigabyte file, it doesn't suck up 20 gigabytes of memory, right?
Gert Franz 19:56
No, no, not at all. You know, this used to be one of the usual first approaches, which we did in 2013-14, when DistroKid was very small and we only had, I don't know, 50 kilobytes of files. What we did was cffile action read: read it into a string, do a listToArray separated by carriage returns, and then you have all the lines; then you loop through the lines, and you do a listToArray of each individual line. Which is very bad; you should never deal with lists unless they are smaller than maybe 200 items. So, looping through a file really just consumes the amount of memory you need for one line. If it's just that cfloop file with index and item, then the line is everything that you read for each iteration through that loop. Now, while this was fast, still, two minutes, or one minute, for looping through a 20 gigabyte file is quite long. And in addition, there were some transformations necessary: look up stuff, validate stuff, etc., bring it into a certain form. And depending on what file type it is, for example whether it is from YouTube or from Spotify or from Apple, different positions were important, so you have to do a lot of assignments. So what you could do is run through all the columns and just say: okay, if it's Apple use this one, if it's Spotify use that one. But that is not performant enough. So one thing that I did is I created a configuration file, and in that configuration file I just said: okay, for Apple, which column is at which position. And then I generated a CFM file which just said: position five, this is the ISRC; position two is the title; etc. That was a flat file, and including that flat file is much faster than having a loop going over all the columns and doing all those things. So there are very, very many small things that you can do to squeeze out ounces of performance.
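The generated per-provider include he mentions might look like this tiny flat file: straight assignments instead of a per-line loop over a column map. The positions and variable names are invented for illustration:

```cfml
<!--- Generated per provider (e.g. a hypothetical apple-columns.cfm)
      and cfincluded inside the parse loop. Positions are assumptions. --->
<cfset row.isrc    = values[ 5 ]>
<cfset row.title   = values[ 2 ]>
<cfset row.streams = values[ 9 ]>
```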
But with all of those applied, we got from, what was it, two weeks down to, I would say, six hours. But then we introduced something that Micha wrote into Lucee, a function called parseDSP. parseDSP is a Java-written function: you call parseDSP, you pass in the file, and as a second argument you pass in an instance of a component, and that component has several methods, for example onHeader, onBody, onFooter, and an on-block method; it has a batch-size method, or I don't know what it's called. You could say, for example: always call this method every 1,000 lines. So after Java has parsed 1,000 lines, it calls this onBody method and passes in a query which contains 1,000 records, right? Already parsed and already matched, etc. For example, another issue that we had was: what if you have quoted delimiters? A song name might have a comma in it, right, and the separator is a comma. Then you would end up having the wrong number of columns if you're doing a listToArray. So you have to handle quoted commas in there, and you need to discover the quotes, and all of this written in CFML just takes a lot of time. So we decided to move that part of the parsing to Java. And then we went from six hours to two or three hours. And now, with a couple of additional things that have been introduced, we went down to two hours, which is great, right? But...
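parseDSP is an internal function, so the callback component below is only a guess at its shape based on Gert's description (a Java-side parser calling back into CFML in batches); all method names are paraphrased:

```cfml
// Hypothetical shape of the callback component passed to parseDSP.
// Java parses the file (quote-aware) and invokes these hooks.
component {

    function onHeader( required array columns ) {
        // inspect / validate the header row
    }

    // Called once per batch (e.g. every 1,000 lines) with a query
    // of already-parsed rows.
    function onBody( required query rows ) {
        for ( var row in rows ) {
            // transform / look up / stage for insert
        }
    }

    function onFooter() {
        // flush any remaining staged rows
    }
}
```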
given the fact that these files are growing by about a gigabyte each month, I'm sure we'll have to do some more performance tuning in a couple of months.
Michaela Light 24:28
So, I mean, just for folks who don't know how long a week is: it's 168 hours. So two weeks is 336 hours, and you went from 336 hours to two hours. That's 168 times faster. Yeah, that's a big improvement. So now, you mentioned to me before we got on the call that there are some other cool features in Lucee you're using: asynchronous logging to the database, listeners, and some others. Tell us about how you're using asynchronous logging and why that's cool.
Gert Franz 25:04
Well, for example: when somebody goes to a certain page, next to the whole logging to ELK, or logging to Loggly, or whatever you're doing, you want to have some statistics on who uses what, for other purposes. Of course we have Google Analytics and all of these things, but sometimes you want to use the data in your application for certain things: when did this user last go to that page, or whatever. So if you want to log something on every single request, you would have loads of inserts into a table on every single request. And if you have 300 people on the site every second, that turns into a massive amount of insert statements that you would have to perform. So we came up with the solution of doing asynchronous logging, and the way we do that is by just taking the query and adding it to a stack. That stack then gets inserted in a batch of 100, or a batch of 1,000, right? We first tried to do that with async equals true, which cfquery allows us to do, but that actually just moved the problem from being visible on the site, where the page took a little longer to load, to something happening in the background. We now use async equals true if something on the site happens and we know that there is a longer update query necessary before the user can move on. For example, if you need to update some data that the user has, and you know that you need to do a lot of lookups, and you don't want the user to wait until the whole process is done. Now, there is a disadvantage. If you do that in a cfthread, for example, which you can do, you have additional work: thread it out, do all your work, and at the end maybe send an email if things go wrong, or notify the user somehow differently.
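A hedged sketch of the batching idea: each request pushes a log row onto a shared stack, and a scheduled task flushes the stack as one multi-row insert. All names (logStack, page_log, the batch interval) are assumptions, not DistroKid's actual code:

```cfml
// In the request: cheap append, no insert on the hot path.
lock name="logStack" type="exclusive" timeout="1" {
    arrayAppend( application.logStack,
        { userId: session.userId, page: cgi.script_name, at: now() } );
}

// In a scheduled task, every few seconds: swap the stack out, then
// flush it as a single batched INSERT.
lock name="logStack" type="exclusive" timeout="5" {
    batch = application.logStack;
    application.logStack = [];
}
placeholders = [];
params = [];
for ( row in batch ) {
    arrayAppend( placeholders, "(?, ?, ?)" );
    arrayAppend( params, row.userId );
    arrayAppend( params, row.page );
    arrayAppend( params, row.at );
}
if ( arrayLen( batch ) ) {
    queryExecute(
        "INSERT INTO page_log (user_id, page, at) VALUES "
            & arrayToList( placeholders ),
        params
    );
}
```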
But if you break out into an asynchronous execution of a query, the issue is that you don't know whether it was successful or not, right? And therefore Lucee supports, next to the async attribute of cfquery and queryExecute, listeners. Listeners allow you to pass in listener equals a component instance, and that component instance needs to have things like on-error, on-before-execute, on-after-execute; I think that's what the methods are called. And then you can do your own thing in there. For example, in the on-before you can even manipulate the SQL. This is sometimes necessary: for example, if you know that you're on a test machine, you maybe don't want to post to certain tables. There are always different kinds of scenarios, and I'm making things up now. But for example, for another client, our Lucee application was one of several applications using a data warehouse, and since we have to pay for slots in the data warehouse, the owners of the data warehouse needed to know which application uses how much of the resources, so that the costs could be assigned to the according budgets. And in order to do that, every single one of our queries needed to have a header. Now, how do you do that without having to go and change all the queries in all the source code? You could, of course, do that with a specific data source connection. But what we decided to do was use a query listener: we defined a global query listener for the data source pointing at that data warehouse, and in the on-before method we just manipulated the SQL by inserting the header in front. That header simply identified the author of the SQL statement. And it's very, very handy, because you can react on errors as well: you can log something, send out an email, or whatever. So query listeners are not the only ones.
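A global query listener along the lines Gert describes might be sketched in Application.cfc like this. The hook names and signatures are paraphrased from the conversation, so treat this as a sketch and check the Lucee documentation for the exact shape your version expects:

```cfml
// Application.cfc (sketch): a global query listener that prepends an
// attribution header so the warehouse owners can bill the right budget.
component {
    this.name = "warehouseApp";

    this.query.listener = {
        before: function( caller, args ) {
            // Prepend an SQL comment identifying the calling application.
            // The app name is an invented example.
            args.sql = "-- app: distrokid-reporting" & chr( 10 ) & args.sql;
            return args;
        },
        error: function( caller, args, exception ) {
            // log, alert, send an email, etc.
        }
    };
}
```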
We also have mail listeners, where you can hook in before the mail send, on mail send, and stuff like that. And so this is what we're using them for: especially handling large amounts of data and long-running queries in the background, so that the user sees the page immediately, but the effects are stored in the background.
Michaela Light 29:58
That sounds great. Can these asynchronous queries be any SQL query that's being run, whether it's logging, analytics, or...
Gert Franz 30:08
Even a select statement, which maybe makes no sense, but yes: anything. Although obviously you need to be a little careful with transactions, because the transactions might break when you're doing something in an asynchronous way. So if you're inside a cftransaction, I'm not really sure how this works now, but it could be that it is not possible inside of a cftransaction tag.
Michaela Light 30:34
Okay, and then you have a listener for the cfmail tag as well, is that one...
Gert Franz 30:41
Exactly. So...
Michaela Light 30:42
Before the mail gets sent, you can run some checks, or you can react if an error happens sending it, even.
Gert Franz 30:48
Yeah. So for example, this is also for a client of ours, for de-risking: when sending out emails to certain users, there needs to be a different header in the subject for mails if they're coming from the test server, right? And we didn't want to make specific changes everywhere, like, okay, use this subject or whatever. Now we've just created a listener for mail, and in the on-before we just figure out whether we're on the test server, and if yes, then we add a prefix to the email subject. So it works like that. That is really fun.
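The test-server subject prefix could be sketched like this in Application.cfc. The setting name and hook signature mirror the query-listener sketch and are assumptions, so verify them against the Lucee docs for your version:

```cfml
// Application.cfc (sketch): prefix mail subjects on the test server.
// application.isTestServer is an invented flag for the example.
this.mail.listener = {
    before: function( caller, args ) {
        if ( application.isTestServer ) {
            args.subject = "[TEST] " & args.subject;
        }
        return args;
    }
};
```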
Michaela Light 31:37
Well, that's cool. Are there any other listeners in Lucee?
Gert Franz 31:43
I think we'll be creating additional listeners pretty soon. I don't know, maybe on HTTP. You know that we have things like caching that we've introduced for functions, for files, even for HTTP calls, for web services. The cool thing is, when you're using Lucee, and I use Lucee every day, you sometimes detect the need for new things. And sometimes we just say: okay, let's extend Lucee. What we did is we created a DistroKid extension, which we use at DistroKid, because some things make sense only for DistroKid, and some things make sense for the wider community.
Michaela Light 32:28
And how does that work? Is that a fork of Lucee, or do you kind of apply a patch to the local version?
Gert Franz 32:36
No, Lucee allows you to create extensions; I don't know if you've ever seen that. In Lucee, an extension is called a LEX file, .lex. So, for example, the PDF extension: if you don't want to use PDFs, you can just install the bare minimum of Lucee, which is, I guess, 21 megabytes, and that doesn't contain anything extra, right? It just allows you to execute Lucee and to connect to the H2 database. Even if you want to connect to MySQL, you need to install the MySQL extension, because you don't need to carry around all the weight of other things. So why not just have a very, very slimmed-down Lucee version? I guess the smallest version in CommandBox, for example, doesn't contain anything: no PDF extension, no image-manipulation extension. We have about, I would say, 25 to 30 extensions that you can choose to opt in or opt out of. And what we created is a Lucee extension called the Lucee DistroKid extension, which contains the things that we wanted to have changed: additional functionality that we use at DistroKid with Lucee but that is either not yet ready for the public or just too specific for DistroKid, so it is not going upstream. So, for example...
Michaela Light 34:10
Does that mean certain functions and tags work differently in your code base than they would for...
Gert Franz 34:17
No, they're just additional ones.
Michaela Light 34:19
Oh, additional features. Okay. You're not overwriting the existing...
Gert Franz 34:25
No, we don't like that.
Michaela Light 34:28
Please don't do that!
Gert Franz 34:30
We could do that, we could extend certain functionality, but we mostly use it to create new functionality. For example, we have a function listToStruct. Now, you might think listToStruct makes no sense, but what you can do is say: here is my list of columns that I read from a file, and I would like it to return a struct which contains the column names and the values, right? So you pass in a list of column names and the list of values, and that is assigned to a struct.
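listToStruct is part of DistroKid's private extension, so this UDF is only a guess at its behavior from Gert's description: zip a list of column names with a list of values.

```cfml
// Hypothetical reimplementation of the listToStruct idea.
function listToStruct( required string names, required string values, string delim = "," ) {
    var nameArr  = listToArray( arguments.names, arguments.delim );
    var valueArr = listToArray( arguments.values, arguments.delim );
    var result   = {};
    for ( var i = 1; i <= arrayLen( nameArr ); i++ ) {
        // Missing trailing values default to an empty string.
        result[ nameArr[ i ] ] = i <= arrayLen( valueArr ) ? valueArr[ i ] : "";
    }
    return result;
}

// listToStruct( "isrc,title", "USRC17607839,Some Song" )
// yields a struct with keys isrc and title.
```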
Unknown Speaker 35:05
Michaela Light 35:07
So can anyone create these? And are they written in CFML or in Java?
Gert Franz 35:13
They can be written in CFML, and they can be written in Java. One company that I know does their deployment with Lucee extensions, because what you can do is write your Lucee extension completely in CFML and bundle it into a .lex file, which then deploys to the web root, right? The Lucee extension contains several subfolders. One subfolder is, for example, the lib folder, and the lib folder contains bundles of JAR files that are then deployed into the Lucee web container or server container and are maybe necessary to execute that extension. Then we have the TLD, the tag library descriptor; the tag library descriptor contains the structure of the new tag that you may be introducing. Or we have the FLD, the function library descriptor, which describes the new functions that you're defining for your Lucee instance. Or you have /applications, and /applications inside this .lex file is deployed to the web root. So what you can do is deploy your complete application as one single .lex file and throw that into the deploy folder in the Lucee web directory, and that gets deployed
to your environment, right. Some people do it like that, because this allows you to version your extensions.
Michaela Light 36:53
Wow, that's really cool. Now, you mentioned that the minimum install of Lucee is 21 megs, so I'm assuming that's really useful if you have an auto-scaling cloud deployment of your app, because it could load up Lucee really fast. How fast is it? How many seconds does it take to load a minimum Lucee?
Gert Franz 37:13
Well, on my Mac, for example, it takes about two seconds, but I'm still not happy, because, you know, CommandBox heavily depends on how fast Lucee is starting up. And that is one of the features that we're working on for Lucee 6: the startup time will be somewhere below half a second, right? So that you finally can use CommandBox really like that, even in a shell script, without a large delay. That is one of the major focuses of Lucee 6, as far as I remember; it's one of the main things I want to put an emphasis on. For us at DistroKid it's not so important how fast it starts up, because we are doing blue-green deployment, where you first deploy to the green servers, wait until they're up and warmed up, then switch the load balancer to point to the new servers, and then you upgrade the other ones. Right. That is how we're doing it.
Michaela Light 38:19
Okay. And then the other thing I think you mentioned is that in the new codebase you've got a way to get CF tags into CFScript. Can you tell us about it? Yes, yes.
Gert Franz 38:34
You know, since you know it yourself, we are veteran CFML developers, and amongst the veteran CFML developers there are some that just love writing in tags, right? And there are tags like cfquery and cfmail which, as soon as you open and close the cfmail or cfquery tag, put you already in output mode. Right, so you can just use hashes, for example, in queries, and you don't need to write a cfoutput and put a hash in there, or functions, for example, like that. And the problem there is converting that into the script version of the tag. Because in Lucee, you can convert every single CF tag into a script statement by just taking away the opening angle bracket and the cf prefix, putting a curly bracket at the end, and closing the curly bracket at the bottom, where the closing cfloop or whatever would be. That's true for all the tags. All the tags, yes. Wow. But still, if you use that for query, you need to do something like echo within the SQL code. That's why there are things like queryExecute, the query notation for script. But what you can do, and this is also very helpful if you have people that are reluctant to move to CFScript, you can give them a way out by allowing them to use cfquery inside CFScript, by just introducing a tag block in CFScript with triple backticks. Right, this is the usual notation for wikis, the trick of triple backticks for code. And it is also useful when you, for example, write CFML code to generate front-end output, right, like divs and LIs and whatever. And sometimes you get into problems there with single or double quotes. So going into tag mode allows you to just start off with stuff that you directly output.
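A sketch of the two ideas above. First, the mechanical tag-to-script conversion Gert describes: drop the angle bracket and cf prefix and wrap the body in curly braces. Second, the triple-backtick "tag island" inside cfscript; the exact island syntax is as described in the interview and should be verified against the Lucee documentation for your version:

````cfml
<!--- Tag notation: familiar to tag-based developers --->
<cfloop from="1" to="10" index="i">
    <cfoutput>#i#</cfoutput>
</cfloop>

<cfscript>
// Same tag in script: "<cf" removed, body wrapped in curly braces
loop from=1 to=10 index="i" {
    echo( i );
}

// Tag island: triple backticks drop you back into tag mode, so tag
// lovers can keep using cfquery with #hash# interpolation as-is
```
<cfquery name="q" datasource="myDSN">
    SELECT name FROM users
    WHERE id = <cfqueryparam value="#url.id#" cfsqltype="integer">
</cfquery>
```
echo( q.recordcount );
</cfscript>
````

The datasource name and query here are made up for illustration; the point is that the island block is parsed exactly like a regular tag-based template.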
And it's also very helpful for converting people to use more scripted versions of their code. Script is very good for preventing output; that is one of the main benefits of using CFScript versus using tags. Tag is way more verbose, so you have more to write, and it generates way more output than script. Script doesn't generate any output at all unless you ask it to. That's why.
Michaela Light 41:33
You're talking about whitespace? Are you talking about the blank lines that appear in the output, or something else? Okay, I was wondering where those came from.
Gert Franz 41:43
Just imagine you're in a regular CFML template and you write a cfloop from one to ten, and on the next line you indent the line by a tab. How should your HTML know that that tab is not intended to be output, right? So it needs to be there. It is absolutely clear that every single carriage return or tab needs to appear, because HTML has no idea that it's not supposed to output that. But in CFScript, you deliberately choose to output something.
In ColdFusion, you have these crutches like enablecfoutputonly, which is just there to prevent whitespace. Or you have the whitespace management, which prevents everything from being output if it's not inside a cfoutput, or the smart whitespace management, which takes every whitespace that is followed by another whitespace and removes it. But all of these are really poor solutions for a problem which comes with the tag notation itself. So, yeah.
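The "crutches" Gert mentions look roughly like this in tag-based code. `enablecfoutputonly` is a real `cfsetting` attribute; the whitespace-management modes he refers to are server-level settings in the Lucee Administrator rather than code:

```cfml
<!--- Suppress all output that is not inside a cfoutput block --->
<cfsetting enablecfoutputonly="true">

<cfloop from="1" to="10" index="i">
    <!--- The tab and newline around this line would normally be sent
          to the browser; with enablecfoutputonly they are suppressed --->
    <cfoutput>#i#</cfoutput>
</cfloop>

<cfsetting enablecfoutputonly="false">
```

In CFScript none of this is needed, because script statements produce no output unless you explicitly call something like echo() or writeOutput().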
Michaela Light 42:54
Yeah, why not avoid it in the first place? And also, I think it just looks like more modern code.
Michaela Light 46:56
towards Lucee or Adobe ColdFusion.
Why do people move to ColdFusion, do you feel?
Gert Franz 47:07
I don't think that they're moving to ColdFusion. Maybe
sometimes somebody else convinces them with their own passion and love about why they're using ColdFusion. It's just like a drug sometimes, like a religion, because once you're hooked, you're maybe not going away that fast, and you are very reluctant to go back to anything else, because it is just so nice and beautiful to use. And I still love working with it. I've worked with other languages as well. But still, not only because I'm part of the Lucee core team,
I just like programming with it. I remember the first time I saw CFML, I thought, what the hell is this? It's so backwards, and it's generating so much output, and it's so ugly, tag-based. I was coming from the Java world and the Delphi world, and for me, having non-typed variables was very, very unusual. But in the meantime, it's the center of my life, my professional life at least. I still love it.
Michaela Light 48:20
Yeah, it's interesting you describe, you know, which language people pick as almost a religion, because people don't always look at the facts of how easy it is to code in something, or what it truly costs as opposed to what it seems to cost. And, you know, I was talking with a CIO the other month, and he was talking about .NET, because, you know, famously .NET is free. But, by the way, you've got to spend thousands or tens of thousands of dollars on Microsoft Windows servers, Microsoft SQL Server enterprise licenses, and the whole ecosystem. I mean, I don't know, I'd accuse Microsoft of being similar to drug dealers, you know how drug dealers sometimes give out free samples to hook you into their ecosystem. And it's almost like that's how .NET is set up. They say it's free, and they have lots of free resources, which is great, but once they've sucked you in, boy, do they want you to write big checks.
Gert Franz 49:24
Still, I think VS Code is one of the greatest things they ever did.
I love VS Code. And yeah,
I totally agree with you. And for me, you know, the reason why I think this is more or less like a religion is because the thing that you are very good at, you don't want to give up, you know. And many of the people that I know are very good at CFML, and we brought on board a couple of the greatest minds in our realm, and they all share that passion. And you can see, if somebody knows something very well, he's passionate about it, and he can talk for hours about the same thing and even bore people to death: oh well, he's starting again with that. Yes, but that shows the passion that people have. And I have that passion, I have had the passion for Lucee for a very long time. And I can understand people having the same passion for other languages. I'm not saying another language is bad or good. Nowadays it almost doesn't matter which language you're using, because they're all compiling down to the same thing. Some might be a little more flexible, some might be a little more fancy, have some better documentation, but in the end they're almost all quite the same. And being part of the core team gives me a lot of confidence in what we are using.
Michaela Light 50:57
I think that's true, that, you know, most computer languages let you do most things. I mean, they're all, whatever, I forget the phrase, the jack of all trades. Is it finite state machines, or?
Gert Franz 51:11
Yeah Oh yeah,
Michaela Light 51:12
Yeah, but it doesn't necessarily mean, you know... I mean, I can use a screwdriver to pound a nail into a wall, but probably a hammer would do the job better, you know?
Unknown Speaker 51:22
There are different
Michaela Light 51:25
Yeah, sometimes you want a sledgehammer, sometimes you want a little hammer. So I think each language, you know, has a sweet spot where it's good, and there's a lot of overlap, like you say, where you could use this language or that language. There is a question of how fast you can get the job done, and CFML generally is really fast. I mean, yes, you know, if you have a manual screwdriver versus one of those electric drill screwdrivers, you can get a lot more screws in with the electric drill. And sometimes, you know, CFML is fewer lines of code and faster to write, and I would say faster to debug. So let's wrap up by, you know, talking about why you're proud to use CFML these days.
Gert Franz 52:16
Well, I can give you the same answer, I guess, that I gave you the last time, because I'm part of CFML and I'm part of shaping the future of CFML, and hopefully of giving it some long years. And, as you said, it allows me to do my job very prolifically and very efficiently. That is why I love CFML.
Michaela Light 52:43
Well, I mean, ColdFusion, I was going to say CFML, has been around for 25 years. Actually, it was DBML, for those of us who remember how it started off, but it really was CFML, they just had a different tag prefix back then. And I'm sure it'll be around for another 25 years at least, if not more. I mean, who knows what's going to be happening in 2045? Probably we'll still all be locked down from this crazy situation. No, I'm just kidding, I hope we're not still locked down from this crazy situation. But, you know, what do you think it would take to make ColdFusion and Lucee even more alive this year?
Gert Franz 53:30
Unfortunately, the conferences are a little bit missing right now.
Talking about it more, sharing the passion, sharing the love. That is what I encourage people to do. What I notice very often is that many people don't show up at conferences, never heard of conferences, never heard of podcasts, never heard of meetups, even virtual meetups. They are just there in their nine-to-five jobs, and they never raise an eyebrow, never look above their horizon. And that is something where we can hopefully reach the people without that visible passion and get them to look out for their fellow CFML programmers, because there are many we don't know.
Michaela Light 54:19
It's like the dark matter of the cold fusion universe.
Gert Franz 54:22
I can't imagine. Yes, I can imagine. And it's just some astrophysics. Thank you.
Michaela Light 54:30
But why do you think some... you know, it's hard to tell how many CFers are silent, because, of course, they're silent, we don't know they exist, hence the phrase dark matter. But, I mean, we know from the number of downloads of Lucee that there are many more people using Lucee than talk about Lucee. Is it maybe 100 times as many?
Gert Franz 54:54
Well, I wouldn't say 100 times as many. But, for example, if you take the average people that are verbose, that are speaking out, the known people, maybe 300 to 400 people that are speaking actively about it, then I think 100 times might hit the number. In the best days of Railo, I remember we had around 80,000 to 100,000 individual IP addresses that I know were downloading it. And we even once did a check of the websites that are out there, seeing which engine they are running on, and we came to a similar number of individual servers.
Michaela Light 55:50
Maybe we'll talk again; you've got another meeting you have to go to. So, yeah, well, I'd love to come back to this topic of how to encourage people who are just listening to this podcast, but never tweet or blog or whatever, to do that. So,
anyway, if people want to find you online, what are the best ways for them to do that?
anyway, if people want to find you online, what are the best ways for them to do that?
Gert Franz 56:12
Well, just send me an email, or go to my LinkedIn profile, perhaps. What's your email? My email is gert at rasia.ch, or gert at [unclear], whichever you want to use. I'm actually not that present on Facebook or on Twitter or on Instagram or any of these other things. It's just me being maybe lazy, and I know that once I'm on them, I will not stop talking or stop browsing or stop doing things. For me, it's a time sink.
So that's why I stay away from them.
And it doesn't put me into jeopardy of tweeting something stupid, like others.
Michaela Light 56:51
You'd never tweet anything stupid.
Gert Franz 56:53
Ah, come on.
Michaela Light 56:55
Fancy my wife.
Gert Franz 56:56
My wife thinks otherwise. Oh,
Michaela Light 57:01
All right. Well, I appreciate you coming on the show so much. Good luck with Lucee 6. When that launches, we'll let people know about it, and I'd love to have you and some of the other Lucee luminaries, if that's the right phrase, come on the show and tell us more about the cool features coming out in the next version.
Gert Franz 57:20
Thank you very much for having me, and talk to you soon. Take care.
Transcribed by https://otter.ai

Michaela Light is the host of the CF Alive Podcast and has interviewed more than 100 ColdFusion experts. In each interview, she asks "What would it take to make CF more alive this year?" The answers still inspire her to continue to write and interview new speakers. Michaela has been programming in ColdFusion for more than 20 years. She founded TeraTech in 1989. The company specializes in ColdFusion application development, security and optimization. She also founded the CFUnited Conference and runs the annual State of the CF Union Survey.
Join the CF Alive revolution

Discover how we can all make CF more alive, modern and secure this year. Join other ColdFusion developers and managers in the CF Alive Inner Circle today.
- Get early access to the CF Alive book and videos
- Be part of a new movement for improving CF's perception in the world.
- Contribute to the CF Alive revolution
- Connect with other CF developers and managers
- There is no cost to membership.