Sunday, July 24

Hi, and sorry about the delay in posting Friday's assignment; I went out of town for the weekend. Here is what I need you to do:

1. Read the chapter on revising for readability that I handed out in class on Wednesday.

2. Go through your existing drafts and find 10 sentences that need revising.

3. Post them to the discussion area under "revising for readability."

4. Identify, by name, the recurring problems you see in your writing, based on the sentences you posted.

5. Revise the affected sentences.

Thanks!

Saturday, July 16

Ask E.T.: Rhetorical ploys in evidence presentations: "Here is the beginning of a collection of rhetorical ploys in evidence presentations, verbal moves that replace real evidence.

FAUX CONSERVATISM This takes the form of 'Our results are conservative; we made conservative assumptions about the model.' The claim is that possible biases in assumptions work against the view advanced by the researcher. This is in fact an implicit error statement. Such claims are sometimes a rhetorical tactic that substitutes verbal fudge factors for quantitative assessments of error. See, for example, the Boeing/Columbia slide with the headline 'Review of test data indicates conservatism for tile penetration.'

IGNORING SELF-CONTRADICTION See the Boeing slide. See also Richard Feynman's example in his Challenger report, at page 137 of 'What Do You Care What Other People Think?'"

Friday, July 15

For this Friday's online class I'd like you to do peer review. Your response should be detailed and at least 150 words long.

1. Post your current draft AS AN ATTACHMENT with the subject:

DRAFT: [your last name]

2. Respond to any draft that hasn't gotten a response yet, but only after you have posted your own draft. To respond, (R)eply to the original message with the subject:

RESPONSE: [your last name]

Hopefully this will be less confusing than the last time we tried this!

Monday, July 11

http://www.alfiekohn.org/managing/ipoa.htm
Attack of the Weasel Words - Newsweek Entertainment - MSNBC.com: "What are the “weasel words” you dislike most?

'Implemented.' You'll see implemented everywhere. In this language, you “implement” rather than speak or do. And then there is enhanced. Everything is being enhanced. That word is being used in place of other more precise and descriptive words. You can enhance your marriage or your job. You can even implement your enhancements. And 'input' is another good one. Companies talk about “input into our people.” This reflects technology and accounting [ideas]. It all has to do with input and outcomes."

Friday, July 8

For today's online class (Friday, July 8th) I'd like you to do two things:

1. Discuss the recent readings on usability, and usability in general, on the discussion board. Post at least a paragraph discussing the readings, and respond with at least a paragraph to someone else's posting. Some questions you might consider answering:

a. Can Nielsen's web criteria be effectively applied to paper documentation?

b. Does "usability" really matter, or is it just an annoying piece of jargon?

c. How on earth could this possibly apply to your job in the real world, if you don't become a technical writer?

You can probably come up with better questions on your own.

OK, the second thing I need from you is to take that article on applying Nielsen's criteria to paper docs and, using that checklist, design a simple usability form that your test subjects can use to evaluate the documentation you are bringing to class on Monday. Post your form to the discussion area for other class members to get ideas from. It's OK if you plagiarize each other like crazy in this assignment, and you can post multiple drafts if you take an idea from someone and improve your original. Try to keep it to one page, if possible.

Wednesday, July 6

Heuristic Inspections for Documentation: "We all are familiar with Jakob Nielsen's heuristics for evaluating the usability of interfaces. When I was conducting a study on documentation usability, I started wondering if there existed a similar set of heuristics for evaluating the usability of documentation. The natural place to pose such a question was the STC Usability SIG mailing list. The response was that there was no heuristics set available, although someone had tried to open the discussion in the mailing list some time ago. An answer, which led to the list of heuristics presented below, was something along the lines of 'Well, now that you asked, why don't you put the heuristics together' and so I did."
Guardian Unlimited | Online | Lazy, stupid and evil design

Friday, July 1

Take the MIT Weblog Survey
Dear students,

I have lots of students right now, and I get tons of email. It would be really helpful if, when sending me a message, you put your course number in the subject line, like this:

Subject: [1010] Question about final paper

Of course, if you're in 2020 or 4310, you would put [2020] or [4310], respectively.

I'm filtering my email now, so if you *don't* follow that example, I'll probably lose your note or not get to it for several days.

Thanks!
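(For the curious: a filter like this just pattern-matches on the subject line. Below is a minimal sketch in Python of that kind of rule. The course tags are the ones above, but the function name and everything else are made up for illustration; this isn't my actual mail setup.)

    import re

    # Course tags from the example above.
    COURSE_TAGS = {"1010", "2020", "4310"}

    def route_message(subject):
        """Return the course tag if the subject carries one, else None."""
        match = re.search(r"\[(\d{4})\]", subject)
        if match and match.group(1) in COURSE_TAGS:
            return match.group(1)
        return None  # untagged mail risks sitting unread for days

    # For example:
    route_message("[1010] Question about final paper")  # -> "1010"
    route_message("question???")                        # -> None
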
About section page template: "The Usability Toolkit is a collection of forms, checklists and other useful documents for conducting usability tests and user interviews."
Improving User Documentation and Customer Care: "by Cem Kaner, Ph.D., J.D. & David Pels, B.A.

In recent years, the Customer Care Survey of Service and Support Practices in the Software Industry has consistently reported that only about half of software publishers put their documentation through a formal test. We thought that these numbers were low, so we checked them at the Software Testing, Analysis & Review (STAR) conference (Orlando, May 16, 1996).

During a plenary session, Kaner asked attendees (software testers) whether their groups tested their companies’ user manuals. Confirming the Customer Care data, at least half the room stood up to signify that their companies did not. This means that reputable companies are not testing their manuals – companies who don’t care about quality don’t spend money to send testers to STAR."