Software dev, tech, mind hacks and the occasional personal bit

Author: James

Presentation Zen by Garr Reynolds

After being impressed by Garr Reynolds speaking in Sydney a year or so ago, I’ve been keen to check out his Presentation Zen book. It is an enlightening read, especially if you have never studied art or graphic design. The book is a little over 200 pages long, with many illustrations and an impressive, clean layout (no surprise there!).

Near the start of the book, Garr talks about creativity requiring an open, child-like mind and a willingness to be wrong and to experiment. He recommends exercising restraint, and focusing on simplicity, clarity and brevity. He starts presentations by brainstorming with pen and paper, whiteboards or post-its rather than in front of the computer (personally I often use story cards, as you can jot slide outlines on them, group them, and shift the order around). He recommends grouping the ideas, identifying the core message, and sticking with that message throughout the whole presentation.

Garr highlights the importance of taking the time to slow down and really think about what to put in the presentation. He suggests that you keep two important questions in mind: “What’s your point?” (what one thing do you want the audience to remember?) and “Why does it matter?” (put yourself in the audience’s shoes). If bits of your content don’t help answer these questions, “when in doubt, cut it out”! Garr also suggests an “elevator test”: can you make your pitch in 30–45 seconds? A structure that works well starts with an introduction explaining the issue (the pain) and the core message, followed by something like 3 parts that support your assertions or solve the pain (sounds a bit like Bosworth’s Solution Selling).

“Amplification through simplification” is central to Garr’s design approach. He advocates lots of empty space to highlight just one or a few important elements on a slide. “Simplicity can be obtained through the careful reduction of the non-essential”, increasing the signal-to-noise ratio of the slides. Garr is a big fan of using images on slides with just a few words. The aim is to make slides which have a strong, memorable impact and enhance the presenter’s spoken words. He also highlights the importance of having the audience know where to look, eg, simplicity plus images leading the eye to the right spot (such as people in images looking towards the words on the slide). Garr is also a big fan of using quotes to support his points.

Garr suggests a mix of symmetrical and asymmetrical slides. Symmetrical slides are more formal and static, whereas asymmetrical slides are often more dynamic and interesting, and activate empty space. He also suggests using a grid, such as the rule of thirds (2 horizontal and 2 vertical lines providing a grid of 9 equally sized boxes), with the main subject placed on one of the crossing points of the lines. Contrast (using colour, shape, space, etc) can be used to make an element stand out and helps the viewer “get” the point of the design quickly. Repetition can be used (eg, text on each slide in an image of a post-it) to provide a professional and unified look. Use proximity to group related objects.

Although Garr doesn’t talk about it explicitly, his sample slides tend to make use of clever typography: often lower case text, with the most important part in a bigger font; a mix of colours, sizes, styles and sometimes rotations to add interest to the slides; generally sans-serif fonts.

On presenting itself, Garr says you should be completely present: enthusiastic and completely focused on the presentation you are giving, lost in the moment. Nothing else. Although you may make mistakes, don’t dwell on them. Practice like mad to become confident and to make the presentation appear easy and natural. However, remain flexible, aware and open to possibilities as they arise (being “in the moment”).

Near the end of the book, Garr says: “It’s not about us [the presenter], it’s about them. And about the message.” He also suggests that shorter is better: leave the audience wanting more, not overloaded (as per the Japanese proverb “eat until 80% full”). On delivery, Garr suggests standing front and centre, leaving the lights on and advancing slides with a remote.

Garr’s points are much more clearly illustrated using images in the book. I would recommend Presentation Zen to anyone who is interested in making more visually inspiring and interesting presentations.

Percent Number in Apache Rewrite Rules (mod_rewrite)

What do the %1 and %2 in a rewrite rule mean? The Apache guide does not help, nor does any other documentation I found. I came across %1, %2 etc in some complex and arcane rules. Google ignores percent signs, which makes it hard to get an easy answer.

We’ll use the rules from my last post as an example.

RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteCond %{HTTP_HOST} ^([a-z.]+)$ [NC]
RewriteRule ^/(.*)$ http://www.%1/$1 [R=301,L]

The %1 refers to a capture group in a previous RewriteCond (strictly, the last one that matched). This differentiates it from $1, which refers to a capture group in the current RewriteRule.
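For instance, here is a hypothetical rule (not from my config) showing both kinds of backreference at once: %1 and %2 come from the RewriteCond, while $1 comes from the RewriteRule pattern itself.

RewriteCond %{HTTP_HOST} ^([a-z]+)\.([a-z]+)\.com$ [NC]
RewriteRule ^/(.*)$ http://%2.com/%1/$1 [R=302,L]

With these rules, a request for http://blog.example.com/page would redirect to http://example.com/blog/page (%1 = “blog”, %2 = “example”, $1 = “page”).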

Hopefully I have littered this post with enough keywords that future googlers will find the answer to the %1 %2 in ReWrite rules more easily 🙂

Adding WWW to domains, and Apache Rewrite Rules (mod_rewrite)

Browse to http://google.com. Then look at the address bar. You’re not really at http://google.com. You’ve been redirected to http://www.google.com. Try the same on w3c, Facebook, Sydney Morning Herald etc.

Why WWW?
Why do all these sites redirect you to a www form? Well, the main reason is that it is advantageous to have a canonical URL, and, if you have to choose one URL, you might as well go with what people seem to expect, which is to include a ‘www’.

What’s so great about having one canonical URL?

  • Cookies: if your users can access the site at www.domain.com and domain.com, you can end up with some horrible cookie and session problems depending on the browser and web framework (behaviour is different between Firefox and IE). Stay tuned for another post with more details on this.
  • Certificates for HTTPS: certificates are usually for a single domain. If your site is available with and without ‘www’, your site will need a certificate for each or a multi-domain certificate (ie, more money and config).
  • Caching: if you have two URLs, any HTTP caching will only be half as effective.
  • SEO: your page rank may be split between links to both possible URLs (though Google Webmaster tools seems to let you combine it)

How?
Right, so now you’re probably just hoping there is an easy way to implement this forced ‘www’ business! Well, the good news is that it’s quite easy if you’re using Apache with mod_rewrite. I googled around to try and find some good rules, but the ones I found were tied to a single hard-coded domain (no good for me, as I have multiple domains pointing to the same server for different countries). See below for what I came up with. It seems to work quite well. You can put it in your virtual host configuration file or even a .htaccess file.

RewriteEngine on
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteCond %{HTTP_HOST} ^([a-z.]+)$ [NC]
RewriteRule ^/(.*)$ http://www.%1/$1 [R=301,L]

Line 1: Are you coming to the site without www. at the start of the host? [NC] means ignore case.

Line 2: Does your domain consist only of letters and dots? (This means that going to the IP address will not fire the rewrite rule.) Grab the domain in a capture group.

Line 3: Rewrite the URL with a www at the front, keeping the hostname from the previous condition (%1) and the path after the domain ($1). Use status code 301 to tell the client that this is a permanent redirect.
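As a sanity check, the logic of the three lines can be sketched in plain Python (a hypothetical helper just to illustrate the matching; the real work is done by Apache):

```python
import re

def force_www(host, path):
    # Mirrors RewriteCond 1: skip hosts already starting with "www."
    if re.match(r'www\.', host, re.IGNORECASE):
        return None
    # Mirrors RewriteCond 2: host must be only letters and dots,
    # so a bare IP address will not trigger the redirect.
    m = re.match(r'^([a-z.]+)$', host, re.IGNORECASE)
    if not m:
        return None
    # Mirrors the RewriteRule: prepend www, keep the path, 301 redirect.
    return 301, 'http://www.%s%s' % (m.group(1), path)

print(force_www('example.com', '/blog'))      # → (301, 'http://www.example.com/blog')
print(force_www('www.example.com', '/blog'))  # → None
print(force_www('10.0.0.1', '/'))             # → None
```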

Po: Beyond Yes and No by Edward de Bono (Book Review)

A few months back, I came across Edward de Bono’s book on Po at a local post office second hand book sale. I decided to risk 50c and buy this out of print, 1972 edition book on creativity and lateral thinking. It was worth every cent 🙂

Until you get a fair way into the book, it’s quite hard to work out what it is about. It is also quite wordy, and oddly organised. However, after reading it for a bit, I found it had some interesting ideas.

De Bono is not a big fan of the yes/no system or of argument. He proposes that the yes/no mindset people usually adopt means that somebody has to be right and somebody wrong. With this mindset, an old theory cannot be replaced by a better one until it can be proven wrong by argument. For subjective subjects, this is often not possible. He proposes that when people have a “right” answer, they are happy and stop looking for a better answer, curbing creativity. Similarly, a “wrong” answer stops that train of thought – and perhaps, if it had continued, a good answer might have been found from ideas triggered by the “wrong” answer.

De Bono sets up PO as an alternative to the Yes / No system and talks about it as a way to break down established patterns and introduce discontinuity in thinking to come up with new ideas. He sees it as an alternative to the “clash” of argument and the “arrogance of logic” in the “closed and highly artificial world” of education, that in later life leads to a “need to be right”. He says that this “need to be right” then leads to people “defending not the idea, but your self-esteem” and having high resistance to new ideas and change.

De Bono disputes the common idea that choosing the best answer at each of a series of questions or steps leads to the optimal solution at the end. He shows several examples where choosing the best answer for each step leads to a solution which is not optimal overall.

Arguably the most interesting part of the book describes a number of tools for lateral thinking.

PO-1: Intermediate Impossible
Rather than immediately rejecting an impossible idea, look at it longer for good points. Reconsider your framework of judgement and concept package – maybe the idea is right if you consider the situation in a different way. The idea can be a stepping stone to a better idea. When other people come up with a “wrong idea”, listen longer and see where it can take you. This approach can be used as a tool – turn the “idea upside down, inside out, back to front” and “say the most unlikely and outrageous thing you can about the situation – and see where it gets you”.

PO-2: Random Juxtaposition
“When you have exhausted the different ways of looking at the problem from within, you bring in” a random word “in order to generate a fresh approach” through juxtaposition and connecting the words. The random word can be from opening a dictionary at random or from a list of “idea provoking” words.

PO-3: Change without rejection, by-passing old concepts to generate alternatives
“That idea is fine, but let us put it on one side and find a new way of looking at things”, “this is one way of looking at things and it is perfectly valid but it does not exclude other ways, so let us try to find some” or “I wonder if there are other ways of looking at this”. “Why do we have to look at things that way?” – let’s reconsider our starting point and understanding.

The last part I want to mention is the discussion of retardant doubt. De Bono suggests that with a Yes/No, boolean mindset, you require certainty of being right before acting. If you don’t have this certainty, your doubt holds you back. You may even create false certainty so that you can act (leading to problems later since you’ll then defend this false certainty). However, in the Po system, there is no certainty. The premise is only that the “current way of looking at things is the best one at the moment, but may need changing very soon”. This means you can act without certainty – your action might not be right in the absolute sense, but you are ready to “change it as soon as circumstances demand”. With the Po approach you explore a wide range of alternatives, choose the most effective idea for now, but be ready to change it for something even better.

Overall, I enjoyed the book (though skimmed some more repetitive bits) and plan to try out some of the lateral thinking tools. If you want to get the book, a second hand bookshop is probably a good option. It is quite expensive on Amazon.

Korean Tree On Life Support

Tree on a drip in Busan

The IV drip was for real, on quite a few trees. Some sort of sucrose solution, presumably going into the phloem.

Korea Trip 2009

Soosun and I are just back from a few weeks in South Korea. We had a good time, with many friends and relatives to catch up with, but also some time to travel around.

IMG_7947

At the start of the trip, I went to the DMZ (the “demilitarised zone” at the border with North Korea). It’s not very far from Seoul, and is only open to “foreigners” (not Korean citizens). It was quite an interesting place, and not a little scary, with minefields, tank traps, North Korean soldiers and stories of massacres (including one with an axe, over tree pruning) and previous gun battles over defectors. The high point of the DMZ tour was going to the tunnels dug by North Koreans for invasion, and the JSA (Joint Security Area), where I briefly stepped onto North Korean soil. The South Korean guards are all chosen for their size and height and stand in a modified Taekwondo stance. In most places we weren’t allowed to take pictures, but here is one of the negotiating table which is half in North and half in South Korea, some protective fencing and the North Korean side of the JSA, including a North Korean soldier.

JSA

Fencing

North Korea

I also went to see some dinosaur footprints at Geoje. Quite a find, with an interesting museum as well.

We spent a few days at the end of the trip on Jeju Island, which is Korea’s most tropical area. There are a lot of beautiful spots, activities and theme parks spread around the island (which is big enough to need a car, but small enough that you don’t have to drive for too long). There is a tall mountain to climb in the middle (about 20km return, 2km up), and some pleasant beaches as well.

More Korea photos here.

Slides & Code: Securing your MVC site against Code Injection and X-Site Scripting

Here are the slides and code from yesterday’s talk at Sydney ALT.NET.

See Steve Sanderson’s post for the code/binary for subclassed aspx compiler and more information about the automatic encoding approach we covered in the talk.

Windows / .NET Dev Tools

Recently I visited a .NET dev team to take a look at design, code and processes with a view to making recommendations to improve delivery speed. One of the more minor, but easily generalisable areas is around tooling. I often find that the little extra tools you pick up can make your work significantly more efficient. Here are a few free ones I use:

KDiff3
A brilliant merge tool that plugs nicely into TFS or SVN. SVN integration is automatic from the KDiff3 installer. TFS integration is manual, but quite easy.

Console2
A tabbed console which works well with classic windows shell and powershell. Good support for resizing, copy paste, etc.

.NET Reflector
.NET decompiler for those dlls that don’t have source. There is also a great plugin that lets you decompile entire assemblies to files on disk.

Fiddler
When you’re debugging SOAP or RESTful web services, Fiddler is great. It lets you see the messages sent / received and even change and impersonate them.

QueryExpress
If you’ve got SQLExpress or just no tools installed, QueryExpress is a tiny (~100K) and quick query analyser style application for all breeds of MS SQLServer. Download in a few seconds, and be running queries before a minute is up.

Unlocker
Don’t you hate it when Windows gets its locks in a mess and you can’t delete/rename files? Unlocker will automatically pop up, show you which applications are holding file locks and let you release the locks.

Process Explorer
A more powerful and accurate Task Manager application which allows you to see file locks and many other types of information.

Talk: Securing your MVC site against Code Injection and X-Site Scripting

I’ll be giving a lightning talk on securing your ASP.NET MVC site against code injection and cross-site scripting next Tuesday 25 August at the Sydney ALT.NET group. I’ll be demonstrating potential pitfalls and dangers of arbitrary code injection, and how you can protect against it, elegantly. We’ve got 6 interesting talks lined up for the night. See you there!

The Fallacy “Best of Breed” in Layered Solutions

Imagine you are designing a layered solution where data enters in a GUI, and passes through several layers for transformation and processing before being written to a database. Everyone knows that layering is a good way to do decomposition, right? It means you can work on each layer separately, without affecting any other layer? It means we can even hand off each layer to a separate person/group/company and different hardware to handle? This is all looking so good, now we can choose a “best of breed” solution for each layer. If each layer chooses the technology and implementation group that is the very best for that sort of work, it must lead to the very best solution overall, right?

Well, data needs to flow through all of the layers in this sort of design. Let’s take an imaginary example: say one layer in the middle has a data field length limit of 255 characters. This means that every layer is then limited to this data length, otherwise the data will be truncated or rejected on the way to/from storage. Instead of getting the advantages of each layer’s solution, you end up being limited to the lowest common denominator of all the layers.
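The lowest-common-denominator arithmetic is easy to sketch (hypothetical layer names and limits, just to make the point concrete):

```python
# Hypothetical per-layer field length limits for the same piece of data.
layer_limits = {
    'gui': 4000,        # best-of-breed front end
    'middleware': 255,  # the weak link in the middle
    'database': 2000,   # best-of-breed storage
}

# The effective end-to-end limit is the minimum across layers,
# not the best layer's limit.
effective_limit = min(layer_limits.values())
print(effective_limit)  # → 255
```

However good the front end and database are, the field is capped at 255 characters the moment the data crosses the middle layer.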

A further problem is staffing and team structure. If each layer has chosen a very different “best of breed” technology, it will be difficult to find one team/company/group that can handle all of the layers (eg, Java front end, BizTalk middleware, Mainframe backend) and do vertical slices of functionality. Of course, you need the “best of breed” for the staffing! Hence, implementation is often split between different teams/companies (horizontal slicing of teams), each of which is known for skills in a particular layer. Although each team may be “best of breed”, we end up with the lowest common denominator again. Methodologies are likely to differ between teams (eg, waterfall vs agile) so the communication and planning is limited to the area of overlap between methodologies. The same applies for project goals. For example, one team may focus on user experience and another may focus on building an enterprise wide data model. It is only where/when these goals intersect that the project can progress efficiently.

What can we do to defuse this sort of architectural design in its infancy? Questions to ask:

  • How many times is the same data transformed, and does each transformation add value?
  • Can multiple layers be hosted in the same process rather than split between different machines/processes?
  • Integration is always time consuming. Do the “best of breed” advantages of a solution for a particular layer outweigh the cost of cross process or cross technology integration?
  • Can one co-located, multi-disciplinary team be formed to build the solution?
  • By comparison, how many people would be required, and how long would it take to build the application with the simplest architecture that could possibly work?

 

