Testing web sites in multiple versions of Internet Explorer


There’s nothing more frustrating than running a web site only to receive complaints that it “doesn’t work” in some specific version of a browser. The major culprit almost always seems to be Internet Explorer – I’ll leave the philosophical debate about why this might be the case for another day.

The issue came up for me because I was messing with a little personal project the other night that uses the Bootstrap 3 framework. I tried the page in IE11 with IE8 emulation, and the pages didn’t work correctly – despite being on a full size screen, the pages were acting all ‘responsive’ on me and rendering the mobile versions. I then read somewhere that IE8 emulation in IE11 is not accurate – pretty useful! It seems IE11 in IE8 emulation mode does not respect conditional comments, which was what was messing with my responsive design: it was not handling the media-query polyfill include that IE8 needs to render modern, responsive CSS correctly.
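For reference, this is roughly what that include looks like in a typical Bootstrap 3 page (a sketch only, with placeholder script paths standing in for wherever you keep html5shiv and respond.js). A real IE8 honours the conditional comment and loads the polyfills; IE11's IE8 emulation apparently just skips it.

    <!-- Loaded by IE 8 and below only; every other browser treats the whole block as a comment -->
    <!--[if lt IE 9]>
      <script src="js/html5shiv.min.js"></script>
      <script src="js/respond.min.js"></script>
    <![endif]-->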

Of course, the problem here is that I don’t have IE8 on a real machine available to me anywhere.

However, help is at hand thanks to Microsoft, which has now made pre-packaged virtual machines available for many combinations of Internet Explorer and Windows.

Providing you have one of the virtual machine packages, for example Virtual PC on Windows or Parallels on Mac, you are in business. I use Parallels, and here are the VM configurations available to me – the range is pretty wide: there’s even IE6 on XP, but also IE8, IE9, IE10 and IE11 on Windows versions including Vista, Windows 7 and Windows 8.

[Screenshot: the list of IE/Windows virtual machine combinations available for Parallels]

Downloading and setting up the VMs is a snap; within a few minutes I had a fully functional version of IE8 running on Windows 7 in a window on my Mac, and 10 minutes after that I had diagnosed the problem with my web site and applied a fix. Note: if you are using Parallels on Mac there is some helpful information on the Parallels web site about configuring the VMs, starting with the fact that the file you download has the wrong file extension – you need to manually rename it before Parallels will recognise it as a VM.

Classic ASP – maintaining sessions between secure and non-secure pages

If you use secure (SSL) and non-secure pages on your IIS-powered web site, and are using session variables to hold information about your users, you will most likely find those session variable values disappearing as your user switches from non-secure to secure pages – for example, when they are browsing your online shop and then click through to Checkout and Pay.

The problem is IIS – by default it creates a new SessionID for a user the moment they hit an SSL page, so any session values you already created against the non-secure SessionID are lost.
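To make that concrete, here's a minimal Classic ASP illustration (the page names are hypothetical). With the default IIS behaviour, the value written on the HTTP page is gone by the time the HTTPS page reads it, because the browser has been issued a fresh SessionID:

    <%
    ' shop.asp, served over plain HTTP
    Session("CartId") = "ABC123"   ' stored against the current (non-secure) SessionID
    %>

    <%
    ' checkout.asp, served over HTTPS
    ' With 'New ID On Secure Connection' left at its default of True, IIS hands the
    ' browser a new SessionID here, so the line below writes nothing rather than "ABC123".
    Response.Write Session("CartId")
    %>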

It’s easy to fix, in IIS7 anyhow. In IIS Manager, select your web site and open the ‘ASP’ settings page. There is an option near the bottom entitled ‘New ID On Secure Connection’. By default this is set to True. Change it to False and click the Apply link.
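If you prefer to script the change, my understanding is that this UI option maps to the keepSessionIdSecure attribute of the ASP session configuration; treat the attribute name as an assumption and check it against your IIS version before relying on it. Something like the following, either in configuration or via appcmd, should be the equivalent of flipping the switch in the UI:

    <!-- applicationHost.config (assumed mapping for 'New ID On Secure Connection') -->
    <system.webServer>
      <asp>
        <session keepSessionIdSecure="false" />
      </asp>
    </system.webServer>

    rem Or from an elevated command prompt
    %windir%\system32\inetsrv\appcmd set config -section:system.webServer/asp /session.keepSessionIdSecure:"False" /commit:apphost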

[Screenshot: the ASP settings page in IIS Manager, showing ‘New ID On Secure Connection’]

Switching between SSL and non-SSL can also be the reason users experience a timeout, or seem to be automagically logged out in the middle of doing something.

Check the page set up on your browser to make sure it prints text as black – NOT!

Here's an easy, blatant example of a web site with a built-in problem that the owner acknowledges, but has decided to make the user's problem.

I picked up a parking ticket the other day and just went to pay the fine. It was with the City of Port Phillip, which, like a bunch of councils around Melbourne, uses the maxi.com.au online payment system (I remember Maxi launching years ago amidst much hullabaloo, with ideas like terminals all over the city and so on. Nowadays it still exists, but only for these small irritating payments).

Anyhow, I dutifully pumped my Visa card number in, and back came the receipt page, which is a classic fail. Here's the page:

[Screenshot: the Maxi payment receipt page, white text on a dark blue background]

The text I've highlighted says:

(If you print your receipt, check the page set up on your browser to make sure it prints text as black.)

So of course, red rag to a bull, I clicked the print button and, yup, out came an entire page of dark blue background plus the white text.

They KNOW it's a problem, but they put it onto the user to solve. Hands up everyone who knows how to change the page setup on their browser so that it would reverse or otherwise deal with this ludicrous situation? I do this for a living and I'd probably fiddle for 5 minutes.

Yet the issue is so incredibly simple to fix – just change the styling on the page to a white background and dark text. In fact, I'm positive the last time I used Maxi this issue didn't exist – which means they've updated the site and created a new problem.
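A sketch of the kind of fix I mean, assuming the receipt sits in some wrapper element you can target (the .receipt class here is made up): a print-only stylesheet rule that forces white paper and dark ink while leaving the on-screen design alone.

    /* Print-only override: white background, dark text */
    @media print {
      body,
      .receipt {
        background: #fff !important;
        color: #000 !important;
      }
    }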

 

How to host an IIS7 web site on a Mac OS X folder via Parallels

This has bugged me for a long time. I run Parallels on my MacBook Pro because I often have to work with Windows programming technologies, mostly Visual Studio and SQL Server. When I'm developing in Windows I run a localhost copy of the web site in IIS7 (I have Vista on my Mac). Up until now I've kept the site files on the Parallels VM's Windows drive, but this has constantly annoyed me because I don't back up my Windows drive with Time Capsule – it takes too long and causes a marked slowdown in my system each hour when the backup is running. Parallels has a specific option to not back up the VM with Time Capsule, and I have that selected.

Every time I've tried to hook an IIS web site to a folder on my Mac (i.e. not in the VM) it has failed with endless errors. Finally, after a bit of hunting around, I pulled together various tidbits of information from the net and made it work. Here's how:

1. In the Parallels configuration make sure you are using Bridged Networking; this ensures your VM has a separate IP address from your Mac.

2. In Mac System Preferences > Sharing, enable File Sharing. Add the root folder for your web site (on your Mac) to the list of Shared Folders and give all users all rights (I kept the permissions simple and just gave everyone every right; it's only my localhost, so I don't think security is a big deal).

3. In Windows use File Explorer, open Network and locate your Mac and the root folder for your web site. Copy the path; on mine it looks like \\MACBOOKPRO-31F6\wwwroot

4. In IIS add a new web site. In Advanced Settings, set the Physical Path to the network path you copied above (there's a rough command-line sketch of steps 4 to 6 after this list).

5. Still in Advanced Settings, set the Physical Path Credentials to your Mac user name and password. I wound up adding a new Windows user with the same user name and password as I have on the Mac. I'm not completely sure this is the right thing to do, but heck, it worked.

6. If you already have a default web site in IIS, you'll need to sort out the bindings: you can't have more than one web site on the same port, which by default is 80. Right-click the new web site and edit the Bindings. I set the port to 8080.

7. In Windows open your browser and try http://localhost:8080/; you should see the default page, if you have one, from the folder on your Mac. If, like me, it doesn't work first time, try creating a simple hello.htm and calling that. I discovered my problem was simply paths and configuration variables in my web site – the moral being, try calling the most basic page first before blaming IIS for errors that really belong to your web app.
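For the record, here's a rough command-line equivalent of steps 4 to 6 using appcmd. The site name, user name and password are placeholders, and the UNC path is just the example from step 3; I did it all through the IIS Manager UI, so treat this as an untested sketch of the same configuration.

    rem Create the site on port 8080, pointing at the shared folder on the Mac
    %windir%\system32\inetsrv\appcmd add site /name:"MacSite" /bindings:http/*:8080: /physicalPath:"\\MACBOOKPRO-31F6\wwwroot"

    rem Set the physical path credentials (a Windows user matching the Mac account)
    %windir%\system32\inetsrv\appcmd set vdir "MacSite/" /userName:"macuser" /password:"secret"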

activePDF ToolKit Error Code -998

So I don't have to yet again hunt down the activePDF ToolKit documentation: if an Error Code -998 is returned by activePDF ToolKit it means 'Product not registered / Evaluation expired'. And you have to take up arms and prepare to enjoy chasing down why your previously registered copy has decided, perhaps in a fit of pique, that it's happier unregistered. Which is not a fat lot of help when the bloke who has the serial number database is in a timezone slumbering through the middle of the night, while it's the midst of a workday for you and you've promised someone something for tomorrow.

I completely agree with software licensing. It's how companies legitimately make money. But some seem to have registration management systems calculated to drive us mere code crunchers up the wall.

 

The Dark Arts

I just finished writing my previous post, and decided to try and find something I'd written to a client in the past about their SEO. With minor edits to conceal the guilty, here's an extract from an email to a web development client of mine last year:

The obvious questions are:

a) Would you achieve those rankings without paying $400 a month? e.g. if we had a decent site, with decent relevant copy and content, you should be up there anyway.

b) How much in sales can be ascribed to the SEO campaign – e.g. how many sales have you achieved from click-throughs from the search engines? That presumes the buyer behaviour is that they might know about companies offering your product, they Google some words, up comes your site, so they click through, and then buy.

2. I'm not entirely happy with the SEO work on your web pages. By my reading it contravenes Google's guidelines. It includes hidden text, hidden links that are solely present just for search engines. These are tactics expressly banned by Google. See http://www.google.com.au/support/webmasters/bin/answer.py?answer=66355.

If you weren't aware of the hidden stuff, try going to your site, then click View > Source. You'll see all the HTML code; scroll down and there's a bunch of stuff that, using some trickery, is not visible to your customers but is visible to the automated search engine spiders.
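(By way of illustration only, and not their actual markup, the kind of hidden block I mean looks something like this: a chunk of keyword text and links styled so it never renders for a human visitor.)

    <!-- Generic example of the trick: invisible to visitors, readable by spiders -->
    <div style="display:none">
      cheap widgets best widgets buy widgets Melbourne
      <a href="http://www.example.com/partner-site">widget specialists</a>
    </div>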

It might work, but the problem is what happens if Google (in particular) catches you: you'll be dropped from their index and then need to fight to get reviewed and reinstated.

This is why I have a problem with SEO like this one. It's interfering with the natural order: sites appear in Google in the order that Google's algorithms have determined based on relevance and popularity, so if you have a good site, with great content, that people want to visit, you'll deservedly rank well. SEO like this relies on tricks and manipulations to circumvent the natural order.

Good site optimization should be about creating a great site, not messing with mother nature.