You can listen to the podcast and read the show notes here.
Michaela: Welcome back to the show. Today we're going to assert control over your legacy ColdFusion applications with TestBox, and I'm here with Ed Bartram. We're going to look at what unit testing is, what TestBox is, and what a legacy application even is anyway. We might delve a little into test-driven development, look at how to install TestBox quickly, and we'll look at issues of code coverage, what tests to write, and all kinds of other best practices and tips. Ed is talking about this at Into The Box, which is less than a month away at our record date.
And just in case you don't know Ed, he's been developing in ColdFusion since the beginning of the millennium, since the year 2000, to say it that way, when ColdFusion was back in version 4.5. He's the co-manager of the Chicagoland CFUG and previously the manager of the Nebraska CFUG, so he's got lots of accolades for CFUG work in the ColdFusion community, and he's also been to many ColdFusion conferences, too many to name I would say. He is a proud owner of a 1973 Volkswagen Beetle. Probably runs on ColdFusion, I'm guessing. So welcome, Ed.
Ed: Well, thank you Michaela. It's an honor to be speaking with you.
Michaela: So let's just start off: what exactly do you define as a unit test?
Ed: Well, unit tests basically test a unit. A unit is a small piece of code; it could be a single method or function, or even a group of methods or functions, depending on how you want to organize the tests. A unit test will test that unit of code, and it will test all branches of logic to make sure that it works well and that it does what it's intended to do.
Michaela: And the idea here is, if all your methods and functions are working correctly, then you've got a better chance that the whole ColdFusion app is working okay.
Ed: That's correct. You end up with code that you can trust, so you're more easily able to make changes to that code confidently.
Michaela: I think you're going to use the R word, refactoring, in a moment if we're not careful.
Ed: Yeah, that's a great part of writing unit tests, especially when you're dealing with a legacy code base. Legacy: there are so many misconceptions about what that actually means. Some folks think that legacy means it's simply old code, but as Michael Feathers describes legacy code in his seminal book, Working Effectively with Legacy Code, it's simply code without tests. So old code with tests is not considered legacy code, and you can even have new code without tests that would be considered legacy by that definition.
Michaela: It's a very intriguing definition, because I thought you were going to say it was full of spaghetti or, you know, used old syntax or something. But you're saying if it hasn't got a good set of unit tests covering the code, then it's legacy.
Ed: Right, and there certainly could be spaghetti code in there, and probably is, and that's part of the problem as well. When you get that spaghetti code, people become afraid to actually make changes to it.
Michaela: Yeah, so the ability to make changes to your code is one of the benefits. What are the other benefits of going through all this unit testing with TestBox?
Ed: Well, like I said, it makes sure that your application runs without error and does what it's intended to do. So you can use these tests any time to ensure that it runs correctly, and it's particularly useful when you make changes to your application, to find out if something breaks anywhere within the code base that you have tests written for. It's also useful when you're deploying to different environments, particularly your production environment. If you run these tests as part of the build process and any test fails, you can have your build process actually stop before deploying that broken code to production.
Michaela: That sounds like a great thing, and I think we'll talk more about that later with continuous integration. But first, you've just defined unit tests; are there other kinds of tests, like integration tests or maybe other kinds?
Ed: Sure. Unit tests typically test the logic within a particular method or group of methods and not anything else those methods touch, and that includes external dependencies such as a database. So what we do is we write integration tests, which are structured much like unit tests, to test those external dependencies: to make sure that the code making those calls out actually works correctly without error. And in the case of database calls, you can even insert records in order to run your test against, say, a read function, to make sure that the data is coming back correctly, so it's doing what you intended it to do.
Michaela: And when you're writing these unit tests, are you testing the logic or dependencies, or what exactly are you testing there, Ed?
Ed: Well, the unit tests are testing the logic, so any time you have like [inaudible] [06:16] or any other place where the code makes a decision, you want to make sure that it all gets tested on both sides of the branch. (So I'm losing track of where I was going on that one.) Could you repeat the question?
Michaela: Yeah, you know, are you testing logic or external dependencies? So unit tests are just testing that the logic is correct, that it doesn't blow up when you call the function or method with different parameters.
Ed: That's correct, yeah, and the integration test is testing the external dependencies.
Michaela: And when you say external dependencies what will be some examples of that?
Ed: Well, the external dependencies would be things like a database, calls out to your file system, Ajax calls to other systems. So anything that you don't have full control over.
Michaela: And does that include human inputs into the app, or…?
Ed: That would be more like functional tests, working with something like Selenium. What I'm talking about with unit tests is after the data comes in, so you can emulate the data coming into your code the way a user would have keyed it in, but we're not actually testing what the user does itself.
Michaela: Okay, so this doesn't involve any human being; it can be totally automated using something like TestBox?
Ed: Yeah it should be totally automated.
Michaela: And, you know, just to look at this, because a lot of people are used to manually testing their code: what are the advantages of writing unit tests? I mean, effectively you're writing a second set of code on top of your app that's going to test all the different units, so that could be a lot of work. Why would you want to have automated testing?
Ed: Well, some of the things that make a good unit test: it should be repeatable, so you want to be able to run your unit tests again and again and get the same sort of results. It should also be easy to implement; if you're having a hard time, it's probably really some other kind of test, possibly an integration test. Unit tests should be relevant tomorrow, so the tests that you write today should run and function and act the same way tomorrow, or the next day, or a year from now. Anyone should also be able to run these tests with the push of a button, so that's where the automation part comes in.
So with that push of a button you should be able to run one of your unit tests, a test suite, or all of your unit tests. You should also be able to hook it up to a continuous integration system such as Jenkins, so that when you make a commit to your repository, that triggers Jenkins to fire off the build, and part of that build process would be to run your unit and integration tests before it goes on to the next step, which would be to deploy to whatever environment you're working with.
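As a sketch of what that build step could look like (this is an assumption on my part, not something named in the episode: it presumes CommandBox and the TestBox CLI commands are installed on the build server):

```shell
# A sketch of the CI build step described above, assuming CommandBox and
# the TestBox CLI commands are available on the build server. The runner
# exits with a failure when a test fails, which lets a CI server such as
# Jenkins stop the build before deploying broken code.
box testbox run
```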
Unit tests should also be consistent in their results; if they're not consistent, your unit tests are going to fail. Sometimes people do write brittle unit tests that are data dependent, and when you promote from one environment to another that data may not be the same. That's where you catch those brittle tests, when they start failing. So you want to make sure that you're not dependent on any data that's already in the database, for example with integration tests. You should have full control of the unit under test, so every logical branch of that code should be tested, and the tests should also be fully isolated: each test should run independently of the others.
One of the things that TestBox does is run them in random order, so your first test isn't always the first one that runs. That eliminates any possibility of your tests being dependent on one another: you can't just set up a scenario in one test and have a second test depend on that scenario, because you never know in what order they're actually going to run. So that forces you into doing it correctly. A unit test shouldn't be dependent on another one, nor should it affect other tests that are running. And finally, when a test fails, it should be easy to detect what was expected and to pinpoint the problem.
Michaela: Well, that sounds like a comprehensive list of good practices for testing. Does TestBox fulfill all those requirements?
Ed: Yes, it sure does.
Michaela: Now you mentioned that you should be able to run the tests independent of whatever code is run. Does that mean your tests have to do some pre-setup and put dummy data in the database, for example, so the method gets whatever it's expecting?
Ed: Yeah, depending on what your test needs to do. At a minimum you want to be setting up a variable pointing to your ColdFusion object. That's a good point that I hadn't mentioned before: your unit tests and integration tests are testing methods and functions within CFCs, so they're not testing any of the code that's in your CFMs. So one of the things you need to do in order to get to the testing is to move your logic into CFCs, so that you can test it from those unit tests and integration tests.
As far as setting up database stuff, that's usually meant for integration tests. Like I said, if you're testing a method that does, say, a database read, don't count on that data actually being in the database, depending on the environment you're working in. So you're going to want to have code in your test wrapped in cftry/cfcatch and cftransaction, so that you can go ahead and insert a record into the database and then call the unit under test.
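A minimal sketch of that pattern, in TestBox's xUnit style (not Ed's actual code; "UserGateway", the "users" table, and getUserById() are hypothetical names used only for illustration):

```cfml
// Integration test sketch: insert a known record inside a transaction,
// exercise the unit under test, then roll back so no test data remains.
component extends="testbox.system.BaseSpec" {

    function setup() {
        variables.gateway = new models.UserGateway();
    }

    function testGetUserById_WhenRecordExists_ReturnsThatRecord() {
        transaction {
            try {
                // Insert a known record so the test doesn't depend on
                // whatever data happens to be in this environment.
                queryExecute( "INSERT INTO users ( id, name ) VALUES ( 999, 'Test User' )" );

                var result = variables.gateway.getUserById( 999 );
                $assert.isEqual( "Test User", result.name );
            } finally {
                // Roll back so no test data is left behind.
                transactionRollback();
            }
        }
    }
}
```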
Michaela: Okay, so you write some setup code to create those records in TestBox, and then you write teardown code after the test is done to clean it up. Is that how it works?
Ed: Yeah, that's correct, because you don't want to be leaving test data in your database, depending on what environment it is. Particularly if you're running it in your production environment, you don't want test data floating around out there.
Michaela: And does TestBox make sure only one test runs at a time? Because I could imagine if you had multiple tests and they were all inserting things in the database, or updating it or whatever, they could step on each other's toes, so….
Ed: It usually does run synchronously, but there is a way to run asynchronously, and there are a few gotchas there that you need to be aware of so that you don't accidentally have your tests stepping on each other like that.
Michaela: Okay, so TestBox is basically a place where you can script out tests, and it has a special testing scripting language. Is that how it works, or what?
Ed: Sure. TestBox is a framework for writing tests, so there are some really smart people out there who've gone ahead and made this easy for us. Within the framework you basically have four lifecycle methods at your disposal. There's beforeTests and afterTests, which run once before and after all the tests, and then there's setup and teardown, which run before and after each individual test. So if you've got code that you want to share between the tests you can put it in there, or you can do all your setup and teardown within the unit test itself.
The unit test is just basically a method or function, depending on how you want to call it, and it typically starts with the word test. What I do as a convention is name the unit test after the method it's testing, so if it's testing a method called readData, it would be called testReadData. I also do something else that's a little uncommon: I add in some more English after it. I put in an underscore and then basically the when, so it's what you're testing, and then another underscore and what the expected result is. So it might be something like testReadData_WhenNoIdPassed_ReturnsEmptyResultSet, something like that would be the actual test name. That way, when you run those tests, you actually see the full test name on the screen or in the report, so you know exactly what happened.
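A sketch of those lifecycle methods and that naming convention together, in TestBox's xUnit style ("DataService" and readData() are hypothetical names, not from the episode):

```cfml
// xUnit-style TestBox bundle showing the lifecycle methods and the
// test_What_When_Expected naming convention described above.
component extends="testbox.system.BaseSpec" {

    function beforeTests() {
        // Runs once before all the tests in this CFC.
        variables.service = new models.DataService();
    }

    function setup() {
        // Runs before each individual test; per-test state goes here.
    }

    function testReadData_WhenNoIdPassed_ReturnsEmptyResultSet() {
        var result = variables.service.readData();
        $assert.isEqual( 0, result.recordCount );
    }
}
```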
Michaela: Okay, so when you're running TestBox you've got hundreds of tests, and if they fail it will pop up in a report showing you what failed, is that right?
Ed: Right, right. It will show all your passed tests in green, it will show anything that errored out in red, and it will show any failures in a kind of yellowish-orange color. Failures come from the asserts at the end of your unit tests. The assert basically declares what the expected response from the test is, so if the assert fails, the test fails, and you'll see that instantly on the report, in yellow.
Michaela: Okay, so as part of the scripting language you have in TestBox, you can state certain assertions about exit conditions from the method, or what should have been changed in the database, or what it should return, anything like that.
Ed: Correct, yeah, and there's a whole library of these assertions; they're actually functions. So you can test if something's equal, you can test if something is a certain type, you can… basically anything you can think of, TestBox handles that assertion.
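A few examples of what that assertion library looks like in use, as a sketch; the variables being asserted against (answer, items, result, message) are made up for illustration:

```cfml
// Sample TestBox assertions, called on the $assert object available
// inside a test method.
$assert.isEqual( 42, answer );              // values are equal
$assert.isTrue( arrayLen( items ) > 0 );    // expression is true
$assert.typeOf( "query", result );          // value is a given type
$assert.includes( "ColdFusion", message );  // string/collection contains
```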
Michaela: And do you have to watch the tests run, or can you programmatically tell whether all the tests pass or not?
Ed: You don't have to watch them all. Typically if you run the tests manually you're looking at them, but you can also set up, like I said, continuous integration with Jenkins, or even [inaudible] [18:14] during your build, so it will run everything and watch to see if anything fails. Then it can report back that failure, or it can send a message through chat, whether you're using Slack or HipChat; it can send out emails; or if you've got, like, a Raspberry Pi set up or something, you could have it ring a bell. Pretty much whatever your imagination can think of, you can set up through automation.
Michaela: That sounds quite entertaining, having a Raspberry Pi ring a bell every time a test fails; it might make for quite a noisy environment on certain teams.
Ed: Sure, but one of the things is that if you have tests that fail and that's part of your build, you want to get those resolved quickly. You really shouldn't proceed any further; you should go back and fix your tests before you do anything else, so that you get those tests running correctly. You don't want to be the guy who breaks the build.
Michaela: Right, because you might have to wear that silly hat for the rest of the day, because you were the guy or girl who broke the build.
Ed: Yeah, I've been on a team where the hat that you had to wear was a tiara.
Michaela: So you were the prince or princess of build breaking.
Ed: Yep.
Michaela: And maybe you even have a board on the wall showing the number of days without an accident of breaking the build, you know, like construction sites do, right?
Ed: Oh sure yeah definitely. I'm thinking of The Simpsons intro now.
Michaela: Now if you're using a continuous integration tool like Jenkins, presumably it can tell that the tests failed. Could it then not deploy the code to production if there were test failures, or not deploy it to staging, or whatever part of the build you're automating?
Ed: Exactly, yeah, that would be ideal. You don't want to deploy code that isn't working, whether it's an error or a failed assertion; you want to make sure that every one of those tests is working and functioning as intended, and if it isn't, you really do not want to deploy to production.
Michaela: So this really makes it much more likely that the code is thoroughly working, at least at the unit level, and not only working on your development server. Because you could run these tests on your development server, I assume, and then they get run again when the code gets deployed to staging, and again when you deploy to production.
Ed: Right, that's correct, and that's how you don't break the build: make sure the tests run in your local environment first, before you ever try pushing the code out to any kind of shared environment.
Michaela: Okay, so TestBox: is it expensive, or free, or what?
Ed: TestBox is free, and you don't need to be using ColdBox to use it either. TestBox will run on any and every framework in ColdFusion, including no framework. So everybody can and should use it; it is currently the recommended testing framework. MXUnit used to be, but that has not been active in quite a while now. The good news for those developers who are still using it is that TestBox is fully compatible with MXUnit, so if you just drop TestBox into your code it can and will run all your tests without you having to make any changes.
Michaela: Well, that is great. So if you have tests that you previously wrote in MXUnit, you can just plug them straight in and it will pick up that legacy test syntax, or you can choose to write new tests using the new syntax in TestBox.
Ed: Correct.
Michaela: Now, although TestBox is free, if you want support, are there paid support packages or…?
Ed: I'd have to look into that. I'd imagine Ortus Solutions, who make TestBox, would be happy to [inaudible] [23:49] consulting services. There are also places on the internet where you can get support as well, particularly the CFML Slack workspace. There's a great testing channel there that you can ask questions in, and lots of friendly people will answer them.
Michaela: That is a great resource, both the testing channel and all the other channels in the CFML Slack. If anyone listening is not on that, I'd recommend checking it out; it's free to join. Now, how does TestBox relate to MockBox? Are they one and the same these days, or…?
Ed: I believe MockBox is part of TestBox, so you can use it to mock objects and mock methods. As for the reasons you'd want to do mocking: there are also stubs and spies and other things, but I don't really get into those in my presentation. Mocking, I think, is a really important concept to understand, and it's what makes you able to write these unit tests.
Basically, if you have a method that calls other methods, when you write the unit test you only want to test the code that's within the one method you're writing that unit test for. You don't want to run the other methods, so you write a mock, which tells the TestBox framework that instead of calling off to this other method, it should just return this hard-coded value, or whatever you want to return instead.
So when you run your unit test against that one method and it calls the other methods, it just gets the supplied values that you've given it instead of actually making those calls out to the other methods. That allows you to test just the logic within the single method you're testing. Like I said, there are other ways of doing this too, with spies and stubs and all that, but I really only cover mocks, which I think is the main important one to get. Once you get that, you can branch out with your knowledge and figure out what some of the other tools do.
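A minimal sketch of that idea with MockBox inside a TestBox test ("OrderService", getTaxRate(), and calculateTotal() are hypothetical names; the point is only that the collaborator call is replaced with a hard-coded return value):

```cfml
// Mocking sketch: stub out a collaborator method so only the unit under
// test's own logic is exercised.
component extends="testbox.system.BaseSpec" {

    function testCalculateTotal_UsesSuppliedTaxRate() {
        var service = new models.OrderService();

        // Decorate the real object with mocking abilities, then tell the
        // framework to return 0.07 instead of running getTaxRate().
        prepareMock( service );
        service.$( "getTaxRate" ).$results( 0.07 );

        // calculateTotal() calls getTaxRate() internally, but the real
        // method never runs; it just receives the supplied value.
        $assert.isEqual( 107, service.calculateTotal( 100 ) );
    }
}
```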
Michaela: The other obvious use of mocking is if you're on a team and maybe some of the code hasn't been written yet, right? Your code depends on someone else's code that doesn't exist yet, so you'd mock that out so you could run your tests.
Ed: Oh, for sure. Yeah, there are plenty of times with the current team I'm working with that we don't quite have all the database stuff set up yet, so while waiting for that we just mock it up and run from there, and then when it really becomes available we can call it in our integration tests.
Michaela: Now, how does this relate to test-driven development? Is this what you'd use to do that? And maybe we should just tell listeners what test-driven development is.
Ed: Well, test-driven development is the process of writing your tests first, and then writing the code that makes the tests pass after that. That's usually done in very small increments: you write a little bit of a test that fails, then you write the code in your code base to make that test pass, then you modify the test a little bit so that it fails again, and then you write a little bit more code in the system to make it pass. They call it red-green testing.
What I'm really covering in the presentation is more about using unit testing and integration testing to get control of your legacy applications. Once you've written all your tests, then you want to move into something along the lines of test-driven development. But if you've already got a code base and there are no tests, you can't really do TDD, just because the code is already there. So you need to pick apart your code, refactor it into smaller chunks that are easily testable, and then build up that suite of tests.
Michaela: But the end goal of writing tests for legacy code is that eventually you've got tests covering all the code, and now when you write new features you could do test-driven development if you wanted to.
Ed: Certainly, yeah. If you're writing any new code, any greenfield code, you definitely want to be using more of a TDD approach: writing those tests first and then writing the code to make them pass afterwards.
Michaela: Now, I know a lot of people are used to just churning out code and then kind of doing testing as an afterthought, so the concept that you write the test first and then you code is quite radical. What's the benefit of taking that approach?
Ed: It's really difficult to get yourself into the mode of writing those tests first and the code after, but one of the benefits I see is that you're actually writing tests that are more meaningful to the code. Sometimes when you write the tests after you've written the code, you're just writing tests that confirm what you believe it's supposed to do. When you write the tests beforehand, making those tests pass basically becomes your project requirements, and that eventually moves you into something like BDD, behavior-driven development, where you've got user acceptance criteria that you're testing against.
Michaela: Now, if you want to install TestBox into your existing legacy environment, how do you do that quickly?
Ed: There are a couple of ways you can do it. If you've got CommandBox set up, you can pretty much run a box command to install it. But if you don't have that available, you can just download TestBox; it's basically one folder with a bunch more folders under it, and you drop that into the root of your site and you've got TestBox. If you don't want to drop it into the root of your code base, you can put it anywhere else and then just create a mapping to it.
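For reference, the CommandBox route is a one-liner (assuming CommandBox itself is already installed and on your PATH):

```shell
# Install TestBox into the current application via CommandBox,
# run from the root of your project.
box install testbox
```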
Michaela: Okay, so it's pretty straightforward, and I'd recommend CommandBox. It's not like CommandBox itself is hard to install, and it's also free, and it also gives you some other testing benefits if you've got to deploy against multiple versions of ColdFusion. So there are definite advantages to using CommandBox, I would say. Now–
Ed: Sure, and like TestBox, CommandBox doesn't require that you use ColdBox either; you can use any framework that you want.
Michaela: Yeah, it's almost like the Box at the end is kind of incidental, you know; it's just that the same group of people created these tools, they don't actually depend on each other. Now, suppose you have installed TestBox, and you've got legacy code and there aren't any tests: how exactly do you start? Because it's likely to be a bit overwhelming; you may have tens or hundreds of thousands of lines of code, none of which have tests. Where would you start if you were dealing with that?
Ed: Pretty much I would start with whatever bugs you're currently working on, whatever code changes you're working on. I like to tell the other developers on my team that if you're touching the code and there's no test for it, write a test for it first. Make sure it passes, and then go ahead and modify the code. So basically you write the tests as you go along, continuing to work; you don't just stop all work and write tests for everything. You do it as you go along.
One of the ways I like to do it is to first pull out the queries. A lot of times with old code you see everything just kind of inline, so I'll start with the queries: I'll pull those out into their own methods and write an integration test for each. Once all the queries are pulled out, you're left with the logic. The logic you can then start reducing into more sensible methods, and then write unit tests against those methods to make sure they're doing what they're supposed to be doing.
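That first step can be sketched like this (hypothetical names; "UserGateway", the "users" table, and getActiveUsers() are illustrations, not Ed's code): an inline query pulled out of a CFM page into its own CFC method, where an integration test can reach it.

```cfml
// A query that previously lived inline in a CFM page, moved into its
// own method on a CFC so it can be called from an integration test.
component displayname="UserGateway" {

    public query function getActiveUsers() {
        return queryExecute(
            "SELECT id, name FROM users WHERE active = 1"
        );
    }
}
```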
Michaela: And when you're doing this, presumably if you see the same query in seven different places, it all goes in one method, not in seven different methods.
Ed: Oh, sure, yeah. It allows you to reduce all those calls into a single one, for sure, and testing in general leads you down the path towards better OO practices.
Michaela: Now, are you assuming that people are writing an MVC kind of app when they do this, that it's got model, view, and controller parts to it, or…?
Ed: Well, that's certainly helpful, but I don't think it's necessary. A lot of times with legacy apps it's all just written inline in CFMs. In that case you need to start moving, like I said, the queries and the logic out of those CFMs. Treat your CFMs like your views; whether you've got a framework or not, your CFMs should be your views. Your model is basically all your CFCs, so move your code into CFCs if you don't already have it there.
Michaela: So you can either shift to MVC, which, once you've got all the queries in separate methods and all your business logic in separate methods, would actually be a pretty easy third step, or you can just say it's fine how it is.
Ed: Yeah, I strongly recommend it. Implement some sort of framework, whether it's ColdBox or FW/1 or anything else; or if you've got some sort of home-grown framework that you're using at your company that you feel comfortable with and need to keep, continue using it. But you want to start separating out your logic from your views.
Michaela: Now, I think that's a good idea, but why do you think it's good to separate out the views from the logic and database stuff?
Ed: Well, basically, if you completely isolate your views, that makes it easier for your application to be modified for other things. So if you wanted to write, say, a mobile app for it, or if you're doing a lot of reporting, you'd be able to just generate your reports as PDFs instead of relying on all your logic being inside the CFM that displays them in HTML. It just makes it more portable.
Michaela: Right, that's a great point, and also from a testing point of view I assume it makes the complexity of the code less and reduces the number of bugs you can have from dependencies between, you know, the pieces of code, logic, and database, because now they're separated, and you know if a method fails exactly what it was doing.
Ed: Right, and then your CFMs, your views, can be tested by [inaudible] [36:47] folk using something like Selenium, so you can cover all your bases that way.
Michaela: Right, and Selenium is the scripted testing tool that will pretend to type keystrokes or click the mouse and lets you simulate a human using the app, but it's all scripted.
Ed: Right, right.
Michaela: So before we turn to any other questions about ColdFusion, Ed, is there anything else you want to share about TestBox and taking control of your legacy apps?
Ed: Well, I want to let folks know that while testing isn't generally considered a cutting-edge or hot topic (most people consider it more like a utility, like plumbing: something everybody knows they should have), I've found in my experience that there are many development shops that don't do testing, and it's usually because of a combination of reasons. They don't know how to write tests. They think it'll take a long time to write them.
They think it's either too hard or too tedious, or they think they already have a code base and it's too late to write tests. When I first started writing tests about six years ago, I found them to be a very useful tool to add to my toolbox. I've been working with well-established code bases for most of my ColdFusion career, and much of it has traditionally been legacy.
Sometimes that code is so knotted up and complex that everyone is afraid to touch it for fear of breaking the system. Like any large knot, you can't attack the problem all at once; you need to tease out the smaller knots, which eventually leads to unraveling the larger knot. So by isolating these pieces of the legacy code into their own methods and writing tests for them, I'm able to break up that complex code that everybody fears into something that's easily understood and trustworthy. Pretty much any developer will tell you that they enjoy doing puzzles; testing is one of my favorite tools for solving the puzzles in an existing code base.
Michaela: And I think also, just by writing the tests, that makes you think about the code and realize some issues in there even without the tests running and failing. You may realize, oh wait a minute, I forgot to deal with this condition or that condition.
Ed: Yeah that's definitely correct as well. It allows you to double check your work.
Michaela: Great! Well, I'm excited that you're talking about this at Into The Box. So let's just turn to some questions about ColdFusion: why are you proud to use ColdFusion?
Ed: Well, it's been the language I've used 100 percent of the time for the last 18 years of my professional life. I love working in ColdFusion because it's comfortable to work in, and it also stretches me as I continually find new things that I can do with it, which is really nice. There's been a lot of effort over the years to evolve ColdFusion and keep pace with other languages and modern programming concepts.
Michaela: Yeah, I absolutely agree with that, and I really feel in the last few years, between all the Box products and other initiatives to make ColdFusion a more modern and alive language, that it really is alive. That's one of the reasons I started the CF Alive podcast: to showcase the cool stuff that you and other people are doing to use ColdFusion in a really modern way.
To be honest, a lot of this stuff doesn't actually exist in other languages; some of this cutting-edge stuff we're talking about on the show you cannot do in other languages. So I just want people to forget this idea that ColdFusion is dying. It is very alive, and it can be used as a modern development language that lets you write amazing applications fast. So that brings me to my penultimate question, which is: what would it take to make ColdFusion more alive this year?
Ed: So I've actually given this a lot of thought over the years. There have been many developers whom we've all come to know as leading voices in ColdFusion, who've contributed to the community via podcasts, blogs, open source projects, presentations, even conferences. That some of them have moved on to other things doesn't, to me, mean that ColdFusion is dying; instead it means ColdFusion is evolving, as opportunities arise for other developers to step in and participate. Some of these developers are new to the community, but some are long-time ColdFusion developers, such as myself, who are just taking a little longer to find their voice.
Because speaking may not be for everyone, there are other ways I think developers can contribute to help make ColdFusion more alive, simply by attending user groups, and not just ColdFusion-specific groups but non-CF groups as well. I would tell people: don't be shy or embarrassed to let others know that you're proud to use ColdFusion. Some developers may give you a little bit of grief, but if you can get past that, they will start to be curious when you show them what ColdFusion can do. You can help these other developers understand that ColdFusion is a modern language, just as capable and alive as whatever language they're using. Another way my employer has helped to make CF more alive is by hiring and training junior developers.
Our development team needed rapid growth, and we had difficulty finding senior-level CF developers locally who were interested in leaving their current positions. Because we were unable to hire remote developers either, we decided to create a training program. Over the last two years we've brought on board seven new junior developers who were quickly trained, and every one of them has been highly successful within our team. In fact, these junior developers have helped rejuvenate many of our senior-level developers' passion and brought in new ideas. The development team we have is experiencing a renaissance, and I think that can easily be duplicated at other companies.
Michaela: I think that's an excellent idea, and you and your company deserve a CF Alive Oscar for taking the initiative there, and not just complaining that you can't hire but doing something positive about it that grows the community and also helps your existing ColdFusion developers feel better. So let's turn to what you're looking forward to at Into The Box this year.
Ed: I'm looking forward to speaking at the conference, or at any conference, to be honest. I've given presentations to user groups and to several development teams, but this is going to be my first speaking engagement at a conference, so I'm really excited about that. I'm also looking forward to learning about a variety of topics, particularly TDD, which is very closely related to my topic, and to meeting new people and catching up with some old friends.
Michaela: I went to Into The Box last year and it was an amazing conference, with lots of cool talks and interesting people. You've got lots of time to chat with speakers and attendees in between sessions, and of course there's also the Happy Box special event, which I believe probably involves a [inaudible] [44:39] band at some point. So it's a lot of fun, and I think I saw an announcement today that there are only 15 seats left at the conference. It's almost totally sold out, so congrats to the organizers on that. It looks like an exciting event. So if people want to find you online, Ed, what are the best places to do that?
Ed: Well, the key to finding me online is the name. My website is EdBartram.com, I'm on Twitter @Bartram, and I'm in the CFML workspace on Slack at Bartram.
Michaela: And just so people listening know, Bartram is spelled with two 'A's', even though it's pronounced as though it had a 'U' in it. We'll put those links in the show notes on the TeraTech site for this podcast episode. So thanks so much for coming on the show, Ed.
Ed: Thank you, Michaela. It's been a real pleasure, and I want to thank you for all you've done for the ColdFusion community. There are a lot of quiet developers out there we may not hear from, but your efforts throughout the years are very much appreciated.
Michaela: Well, thanks for saying that, and if anyone listening wants to come on the show because they want to go from being a quiet developer to sharing their cool ColdFusion stories, please send me an email or message me somehow. I'm always happy to get new guests on the show, and I'm delighted you came on. This is your first time on the show, and I'm sure it won't be the last.