Serious Word Document Problems - Please, PLEASE Help! :(


Sikerra

Hello there. I am a fourteen-year-old who loves to write, and I've suddenly
found that the stories saved on my Mac are somewhat inaccessible!
Here are my problems laid out as best I can make them:

Opening Directly from Word

- Although I can open up Word and create new documents, type, etc., as
soon as I go to File > Open, Word locks up and I have to quit the
program. (The rest of my computer is unaffected.)
- However, by going to File and then viewing the drop-down list at the
bottom of the menu that recalls the last few documents I had opened
before this problem occurred, I can open any of THOSE documents, and
those documents only. No others.

Opening Directly from the Desktop

- If I try to open a document from the desktop by double-clicking on
it, Word locks up.

Saving From Word

- If I go to 'Save As', Word locks up. (But I can 'Save.')

This problem has happened to me twice before. Both times, Word began
working properly again after a certain amount of time. But now the
problem's back, and won't go away.

The documents seem to be perfectly intact - I can open them with
TextEdit, and they're not changed in any way.

I have no idea what could be causing this. I have not opened up any
strange e-mails (or ANY e-mails, for that matter) in about two months;
have not been to any new or different websites; so I don't think this
could be a virus. And only Word is having these issues.

Please, please help if you can; not being able to save any new
documents or open any old ones without Word freezing up is a huge
problem. :(
 

CyberTaz

Hello-

Although your post is quite detailed & well presented, you left out some of
the most important details: which version of Word & Mac OS are you using?

Assuming OS X, the first suggestion would be to use Disk Utility to Repair
Disk Permissions on your hard drive, then log out & log back in and see if
the problem persists.

Post back with your results if necessary & include the version info.

HTH |:>)
 

JosypenkoMJ

It sounds like your copy of Word has had holes punched in it (usually
by another program), or has been corrupted in some other manner.
Usually when a program gets this bad, I reinstall it, or delete it and
copy a backup of it (and all of its files). Reinstalling will remove any
modifications you may have made to Word, such as custom windows and
macros. You may wish to try to save these first before reinstalling.
 

JE McGimpsey

JosypenkoMJ said:
It sounds like your copy of Word has had holes punched in it (usually
by another program), or has been corrupted in some other manner.
Usually when a program gets this bad, I reinstall it, or delete it and
copy a backup of it (and all of its files). Reinstalling will remove any
modifications you may have made to Word, such as custom windows and
macros. You may wish to try to save these first before reinstalling.

It's *very* unusual for a Mac application to corrupt or be corrupted by
another application. Simply deleting and reinstalling will likely NOT
change the behavior - corruption in Word is usually in the preferences
and the Normal template, neither of which is touched during a
Delete/Reinstall (well, for Word 2004, which stores the Normal template
in the MUD folder; earlier versions store Normal in the application
folder, so deleting the entire Office folder may cure the problem by
deleting the Normal template).

Delete/Reinstall is often effective for Windows machines, as the
reinstall can overwrite damaged registry keys. The same thing doesn't
happen on Macs - unless you delete the preferences, the reinstallation
will behave the same as the old installation.

If one wants to Delete/Reinstall Office, it's best to use the Remove
Office application, which will remove preferences and hidden files (but
not anything in the MUD folder). For Office v.X and Office 2004, the
application is on the installation disk. For earlier versions, the
application should be downloaded from MacTopia.
 

JosypenkoMJ

JE said:
It's *very* unusual for a Mac application to corrupt or be corrupted by
another application. Simply deleting and reinstalling will likely NOT
change the behavior - corruption in Word is usually in the preferences
and Normal template, neither of which is touched during a
Delete/Reinstall (well, for Word 2004, which stores the Normal template
in the MUD folder; earlier versions store Normal in the application
folder, so deleting the entire Office folder may cure the problem by
deleting the Normal template).

Delete/Reinstall is often effective for Windows machines, as the
reinstall can overwrite damaged registry keys. The same thing doesn't
happen on Macs - unless you delete the preferences, the reinstallation
will behave the same as the old installation.

If one wants to Delete/Reinstall Office, it's best to use the Remove
Office application, which will remove preferences and hidden files (but
not anything in the MUD folder). For Office v.X and Office 2004, the
application is on the installation disk. For earlier versions, the
application should be downloaded from MacTopia.

I normally don't even want to think about and deal with the nonsense of
what Word is doing when it is corrupted (corrupt files here, there,
everywhere). It is far easier to treat it as any corrupted program and
simply reinstall it (although from what you said, MS has changed the
word 'install' to mean a partial install - no preferences. Not many these
days know how to call a spade a spade). I don't even install - I just
recopy all the Word files from a backup.
 

Jeff Wiseman

JosypenkoMJ said:
I normally don't even want to think about and deal with the nonsense of
what Word is doing when it is corrupted (corrupt files here, there,...

I normally "don't want to think about and deal with the nonsense
of what Word is doing" either :) Unfortunately, if you want to
use it successfully, you will have to learn some things about it...

I believe that what JE McGimpsey was implying (and I agree with him)
was that "Word" itself is not corrupted. Some files that it uses (e.g.,
user preferences) have been corrupted, and through Word's normal
everyday buggy, unstable behavior, it chokes to death when it
tries to open such a file.

JosypenkoMJ said:
...everywhere). It is far easier to treat it as any corrupted program and
simply reinstall it...

Because of the way OS X stores files, it is very unlikely that
the application software itself is corrupted. The need to
reinstall programs of any sort due to corrupted applications,
the way you would in Windows or even early-days Mac OS,
is pretty rare. In fact, doing so with many applications can
actually introduce problems if it is not done exactly correctly,
taking into account the correct removal of any earlier installs
(Office for Mac is one of them). That is why installers are used.

JosypenkoMJ said:
...MS has changed the word install to mean partial install - no
preferences. Not many these days know how to call a spade a spade...

Installs, in general, do NOT diddle with your preference files.
If they did, every time someone did an upgrade, they would have to
reset absolutely everything, which many people would object to
(especially since it could include having to reconfigure all the
styles and such in their Normal file) - especially on a multi-user
system like OS X, where you could conceivably have 40 different
users, all with their own preferences set up. Is an install
supposed to seek out and destroy the preferences of everyone on
the system? In general, no UNIX-based application would ever do
this, as it violates the separation of user configurables and
system configurables.

JosypenkoMJ said:
...I don't even install - I just
recopy all the Word files from a backup...

Depending on how you go about this, it could be the worst
possible thing to do. Running a full restore of the entire system
may be OK, but copying from a backup can change permissions and
ownerships, which can also create a nest of trouble.
 

JosypenkoMJ

Jeff said:
I normally "don't want to think about and deal with the nonsense
of what Word is doing" either :) Unfortunately, if you want to
use it successfully, you will have to learn some things about it...

I believe that what JE McGimpsey was inferring (and I agree with)
was that "Word" is not corrupted. Some files that it uses (e.g.,
user preferences) have been corrupted and through Word's normal
everyday buggy unstable behavior, it chokes to death when it
tries to open such a file.

Because of the way OS X stores files, it is very unlikely that
the application software itself is corrupted. The need for
reinstalling programs of any sort due to corrupted applications
the way you would in Windows or even early days Mac OS programs
is pretty rare. In fact, doing so with many applications can
actually introduce problems if it is not done exactly correctly,
taking into account the correct removal of any earlier installs
(Office for Mac is one of them). That is why installers are used.

Installs, in general, do NOT diddle with your preference files.
If they did, every time someone did an upgrade, they would have to
reset absolutely everything which many people would object to
(especially since it could include having to reconfigure all the
styles and such in their Normal file). Especially on a multi-user
system like OS X where you could conceptually have 40 different
users all with their own preferences set up. Is an install
supposed to seek out and destroy the preferences of everyone on
the system? In general, no UNIX based application would ever do
this as it violates the separation of user configurables and
system configurables.

Depending on how you go about this, it could be the worst
possible thing to do. Running a full restore of the entire system
may be OK, but copying from a backup can change permissions and
ownerships, which can also create a nest of trouble.

I guess I don't need to worry about the Unix type of environment with
file permissions - my OS is 9.2.2. I'm also not using shared software on
a network - the environment is single-user.

Before there were Macs and then PCs, and third-party software like
Word with dozens of files and individual installers putting files who
knows where (although lately there are installer logs), there were OSes
(e.g., RT-11) where:

- there was only one installer, the system installer
- all programs other than user-written ones were part of the OS. If one
got corrupted, such as the text editor, a person simply recopied it (one
file) from the original system disk - it would have been ridiculous to
rerun the OS installer.

I've had one situation of QuickTime 3.0 being corrupted every once in
a while on a 7300, OS 7.6._. PictureViewer would hang - replacing it
did nothing - I had to rerun the installer to replace everything.
Jeff said:
...Unfortunately, if you want to use it successfully, you will have to learn some things about it...

Unfortunately, yes, although an end user should not have to be
concerned about the inner workings of a product.
This is like a car manufacturer making a car with a loose timing belt
and expecting the driver, every once in a while, to take out his
wrenches from the trunk toolbox and reset the belt after the engine
has skipped a tooth, forcing the car to die in the middle of the road.
Jeff said:
...Is an install supposed to seek out and destroy the preferences...

Yes. If I install a new alternator in a car, I definitely would not
want any pieces of the old one remaining, which may be broken, worn
out, or burnt out.
 

Jeff Wiseman

I guess I don't need to worry about the Unix type of environment with
file permissions - OS is 9.2.2. Also not using shared software on a
network - environment is single user.

Oops. Sorry, I forgot that there might be non-OS X Microsoft
Office installs out there. Your suggestions certainly have merit
under those criteria :)

I absolutely refused to put anything Microsoft on my Mac prior to
OSX because:

a) The number of required extensions in the system would
literally double, and

b) Most other applications would tend to break (probably due to
"a" above).

In OS X, the runtime environments for different applications
can be isolated a bit better, so it's not nearly as serious as it
was on those earlier OSes.
 

Beth Rosengard

I *loved* OS 9.1 and Office 2001, Jeff. It was simple to adjust memory
settings when/if things bogged down. And it was fast. The only reason OS
X and Office 2004 are faster for me is that I'm using a G5 machine with 1
GB of memory.

In fact, I was the very "last on my block" :) to upgrade to OS X and had to
be dragged kicking and screaming by my fellow MVPs. Just ask any of them.
Believe it or not, I didn't make the transition until just short of a year
ago!

Cheers,

Beth
 

Jeff Wiseman

Beth said:
I *loved* OS 9.1 and Office 2001, Jeff. It was simple to adjust memory
settings when/if things bogged down. And it was fast. The only reason OS
X and Office 2004 are faster for me is that I'm using a G5 machine with 1
GB of memory.

In fact, I was the very "last on my block" :) to upgrade to OS X and had to
be dragged kicking and screaming by my fellow MVPs. Just ask any of them.
Believe it or not, I didn't make the transition until just short of a year
ago!


Well, this is the very reason that you SHOULD hang on to older
stuff--when it works :) I was using a Rev A iMac for 7 years
before I was forced to move on. One primary reason: when Apple
forsook the classic OS, all development and bug fixing for classic
apps basically came to a halt, including browser development. It
had gotten to the point that Java incompatibilities between my
browsers and my banking and job-search sites were becoming
impassable. Then, on the very day I had transferred the last of
my legacy data onto my new machine, the networking hardware on my
old motherboard packed it in. There was no going back.

This is useful info for me, though. During the OS 7.x-8.x era
I attempted, on a couple of different occasions, to install MS
products. They created so much instability and broke so many of the
other applications I was using at the time that, after
spending a LOT of time trying to get them functioning in a way
that was nice to the rest of the system, I gave up and had to
remove them. After such experiences, and hearing of lots of other
similar troubles, it was very easy to write off Microsoft
applications as something to be avoided at all costs if possible.

Whether the problem was finger trouble or the applications
themselves, I lost all faith in the apps and had no future
desires to have anything to do with them (or rather to risk
putting them on my system). It's hard to shake off such distrust
that has developed over time.

However, since I am familiar with the UNIX-type underpinnings of
OS X, I have a bit more confidence in installing these (as well
as other products that I personally view as "high risk") now.
Obviously products can't be made fool-proof (especially if they
are haxies), but it does improve my confidence a bit knowing how
the OS can limit the damage an application can do if it skips out
and runs wild.
 

John McGhie [MVP - Word and Word Macintosh]

Welcome to the wonderful world of modern computing :)

On Windows and Mac OS X, there is a single system installer. Other software
vendors are supposed to simply call the system installer. Due to the recent
influx of malware, later Windows versions are much more proactive about
checking that stuff has been properly installed using the system installer,
and silently rolling it back if it hasn't been.

Mac OS X also has a system installer, and people who provide an installer
for their software would normally use it. But OS X still supports
drag-and-drop installation, which bypasses the installer.

As Jeff pointed out, it's usually worth advising non-technical users to run
the installer if there is one -- it guarantees that any previous versions
are correctly cleaned up and the files are added with the correct
permissions.

For the past 20 years or so, disk errors have been so rare that the
application files are not normally the problem: it's the user preferences
that get corrupted (because after installation, they are the only files that
ever get "written" to).

Neither the Mac nor Windows installers will replace an application file that
already exists. That's why you have to run the "Remove Office" utility.
However, the Windows installer will clean up the "Prefs" (because it checks
and corrects the registry, which is where Windows stores its prefs). The
Mac OS installer simply ignores any Pref file that already exists.

Doing a drag-and-drop "Install" won't fix this, because the Pref files do
not exist on the distribution media. But doing what you recommend,
restoring from a backup, WOULD fix the problem PROVIDED that your backup
included the Pref files. Many people do not include pref files in their
backups, because their content tends to need to be consistent with
the rest of the machine environment of the moment. So you can cause more
problems than you cure if you restore a month-old set of Pref files. Most
Mac apps will re-create their Pref files at first run if they do not already exist.
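That recreate-at-first-run pattern is simple enough to sketch. Here's a minimal, hypothetical illustration in Python (the JSON pref format and names are invented for the example; real Word preference files are binary and nothing like this), showing why deleting a corrupt pref file is a safe reset:

```python
import json
import os

def load_prefs(path, defaults):
    """Return the app's preferences, recreating the pref file from
    factory defaults if it is missing. Deleting a corrupt pref file
    therefore just resets the app on its next launch."""
    if not os.path.exists(path):
        with open(path, "w") as f:
            json.dump(defaults, f)       # first run: write the defaults
        return dict(defaults)
    with open(path) as f:
        return json.load(f)              # normal run: read what's there
```

Trash the file and the next launch simply starts over from the defaults, which is exactly why "delete the preferences" is the standard first-aid step.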

Hope this helps

JosypenkoMJ said:
I guess I don't need to worry about the Unix type of environment with
file permissions - OS is 9.2.2. Also not using shared software on a
network - environment is single user.
Before there were Mac's and then PC's, and third party software like
Word with dozens of files and individual installers putting files who
knows where (although lately there are installer logs), there were OS's
(eg. RT-11) where :

- there was only 1 installer, the system installer
- all programs other than user written ones were part of the OS. If one
got corrupted, such as the text editor, a person simply recopied it (1
file) from the original system disk - it would have been ridiculous to
rerun the OS installer.

I've had 1 situation of Quick Time 3.0 being corrupted every once in
a while on a 7300, OS 7.6._. PictureViewer would hang - replacing it
did nothing - I had to rerun the installer to replace everything.


Unfortunately, yes, although an end user should not have to be
concerned about the inner workings of a product.
This is like a car manufacturer making a car with a loose timing belt
and expecting the driver every once in a while to have to take out his
wrenches from the trunk toolbox and reset the belt, after the engine
has skipped a tooth forcing the car to die in the middle of the road.


Yes. If I install a new alternator in a car, I definitely would not
want any pieces of the old one remaining, which may be broken, worn
out, or burnt out.

--

Please reply to the newsgroup to maintain the thread. Please do not email
me unless I ask you to.

John McGhie <[email protected]>
Microsoft MVP, Word and Word for Macintosh. Consultant Technical Writer
Sydney, Australia +61 4 1209 1410
 

Jeff Wiseman

John said:
As Jeff pointed out, it's usually worth advising non-technical users to run
the installer if there is one -- it guarantees that any previous versions
are correctly cleaned up and the files are added with the correct
permissions.


Let's just say that it "attempts" to guarantee it :)

In any event, using the application's installer/uninstaller tends
to provide the greatest chance of success (unless, of course, it
was written by Epson :-S )

John said:
Doing a drag-and-drop "Install" won't fix this, because the Pref files do
not exist on the distribution media. But doing what you recommend,
restoring from a backup, WOULD fix the problem PROVIDED that your backup
included the Pref files.


AND provided that you do a formal restore from the backup and NOT
just a copy (e.g., a drag and drop) from the backup media when
using any type of Unix-based file system such as OS X's. When you
copy or just drag and drop, the ownership and access permissions
of the final files will usually be those of the person doing the
restore instead of what they were when they were originally
backed up. If the application was originally installed with a
decent installer that had a receipts file for it, running a
permissions repair after restoring the files might correct the ownership
and permission problems (and I emphasize MIGHT, since I have
seen installers that still foul this up). If there was no
receipts file, then all bets are off.

In the original Mac OS, root = admin = common user. This is no
longer the case in OS X, which is why it is better for security.
However, you now have to account for these differences. Simple
copying or drag-and-drops between these ownership domains will
NOT usually produce the results that are desired.
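The copy-versus-restore distinction is easy to demonstrate. This is a rough Python sketch (file names invented; real restores also involve ownership, which only root can set), using file timestamps as a stand-in for "metadata survived": `shutil.copy` is loosely analogous to a drag-and-drop, while `shutil.copy2` also carries the metadata across, the way a proper restore tool would.

```python
import os
import shutil
import tempfile
import time

def compare_copies():
    """Copy a file two ways; report whether each copy kept the
    original's modification time (our stand-in for metadata)."""
    workdir = tempfile.mkdtemp()
    src = os.path.join(workdir, "backup-file")
    with open(src, "w") as f:
        f.write("settings from last month")
    month_ago = time.time() - 30 * 24 * 3600
    os.utime(src, (month_ago, month_ago))   # pretend it was written a month ago

    plain = os.path.join(workdir, "drag-and-drop")
    restore = os.path.join(workdir, "formal-restore")
    shutil.copy(src, plain)      # contents + permission bits only
    shutil.copy2(src, restore)   # also preserves timestamps where possible

    kept = lambda p: abs(os.path.getmtime(p) - month_ago) < 2
    return kept(restore), kept(plain)
```

On a typical system `compare_copies()` returns `(True, False)`: only the metadata-aware copy still looks like the original, which is Jeff's point about drag-and-drop "restores" in miniature.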
 

JosypenkoMJ

John said:
restoring from a backup, WOULD fix the problem PROVIDED that your backup
included the Pref files. Many people do not include pref files in their
backups, because their content tends to need to be currently consistent with
the rest of the machine environment of the moment. So you can cause more
problems than you cure if you restore a month-old set of Pref files. Most
Mac apps will re-create their Pref files at first run if they do not already exist.


For everything, my backups are simple - I just copy the whole disk to
another, which takes about an hour. User-made stuff I copy at the end of
the day.
 

JosypenkoMJ

Beth said:
I *loved* OS 9.1 and Office 2001, Jeff. It was simple to adjust memory
settings when/if things bogged down. And it was fast. The only reason OS
X and Office 2004 are faster for me is that I'm using a G5 machine with 1
GB of memory.

In fact, I was the very "last on my block" :) to upgrade to OS X and had to
be dragged kicking and screaming by my fellow MVPs. Just ask any of them.
Believe it or not, I didn't make the transition until just short of a year
ago!

Cheers,

Beth


9.2 is nice. I'm delaying upgrading to OS 10._ as lonnnng as possible.
I hate the task bar - now I have to chase windows all over the
place. Why is Apple making the interface look like Windows? The pre-OS 10
"collapse in place" window-shade feature is the way things should
be - let things remain where they are.
 

John McGhie [MVP - Word and Word Macintosh]

I'm with you on that one. I didn't like the OS X interface when I first saw
it, and two years later, I still don't like it much.

I haven't tried Tiger: Paul tells me I am not only a luddite but doing
myself a disservice there :)

But then, I do like the Windows XP interface (not as "pretty" as OS X, but a
damned sight faster when you're in a hurry...) so I guess there's no hope
for me :)

Cheers

JosypenkoMJ said:
9.2 is nice. I'm delaying upgrading to OS 10._ as lonnnng as possible.
I hate the task bar - now I have to chase windows all over the
place. Why is Apple making the interface look like Windows? The pre-OS 10
"collapse in place" window-shade feature is the way things should
be - let things remain where they are.

 

Jeff Wiseman

John said:
I haven't tried Tiger: Paul tells me I am not only a luddite but doing
myself a disservice there :)


I'm not sure I agree. I track a lot of the Mac system forums and
newsgroups and Tiger still has some problems. I'd give 'em a
little longer to work out the kinks. That is why I'm still on 10.3.8.

But then, I do like the Windows XP interface (not as "pretty" as OS X, but a
damned sight faster when you're in a hurry...) so I guess there's no hope
for me :)


Actually, I don't think it is so much the Mac OS X GUI as it is
the true multiuser, multitasking, full-virtual-memory, Unix-like
underpinnings of OS X that give it that sluggishness. I've used X
Windows, Solaris, and OpenWindows on Sun workstations, and they
all seem to have that same type of sluggishness. In fact, OS X
seems to have the edge in terms of "feel," but that may simply be
because I've only used it on workstations that are faster than
the ones running the other interfaces. A simple process such as
loading an application involves a lot of work on a
virtual memory system. That's why folks find it is much faster to
just leave all of their important applications running all the
time - then you get (almost) instant gratification when you want
it :)

For that reason, I suspect that it's going to be hard to ever again
match the great responsiveness and feel of a single-user, pseudo-multitasking
OS such as the original Mac, where you could sizzle
through operation after operation and the computer could actually
keep up with you.
 

John McGhie [MVP - Word and Word Macintosh]

Hi Jeff:

Is it Sunday already? Must be time to lose the rest of the weekend in a
discussion with Jeff :)

Jeff said:
Actually, I don't think it is so much the Mac OS X GUI as it is
the true multiuser, multitasking, full-virtual-memory, Unix-like
underpinnings of OS X that give it that sluggishness.

OK, there are two issues there. I was talking about the Windows XP GUI. I
find it quicker to get stuff done: it may be ugly, but it's fast. Even if
you know it well, OS X seems to involve more keystrokes and contortions to
do things.

That would be at least in part because I come from a Windows background, so
my computer and folder structures and such are all set up that way. If I
were to take advantage of all the power tools buried in OS X and change my
working style, I suspect I would find fewer disadvantages in the UI.
Jeff said:
I've used X Windows, Solaris, and OpenWindows on Sun workstations, and
they all seem to have that same type of sluggishness. In fact, OS X
seems to have the edge in terms of "feel," but that may simply be
because I've only used it on workstations that are faster than
the ones running the other interfaces.

Yes, they do. But I am not sure that's the fault of Unix. I suspect that
part of that is due to application software vendors being unwilling to
re-architect their applications for a multi-threading, multi-tasking
environment.

Certainly it wouldn't be virtual memory that's doing it (unless you happen
to be "using" the virtual memory...) :) We've had virtual memory since
the days of DOS and Mac OS 7 or whatever.

Virtual memory simply assigns a portion of the disk to impersonate "real"
memory. If the user runs more applications than he has memory for, the
system "pages" the content of one or more chunks of an existing application
out to virtual memory to make room. If that happens much, the system gets
seriously slow. Anyone with a gig of real memory in OS X (or Windows) will
never hit virtual memory, PROVIDED they quit each application when they
finish with it.

If they adopt the old Mac user's operating method of leaving everything
running between uses, then they WILL be hitting virtual memory and their
system will be slow :) I can't seem to get this through to Windows XP
users either: if you don't quit stuff you are not using, eventually your
system starts flogging the disk looking for memory and everything gets
veeerrry s l o o o o o w w w w . . . .
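That "flogging the disk" effect can be sketched with a toy page-replacement model (a Python sketch under simplifying assumptions: LRU eviction, one page per "app", illustrative numbers rather than measurements). As long as the working set fits in RAM, you only pay the cold-start faults; the moment it is one page too big, a cyclic workload faults on every single access:

```python
from collections import OrderedDict

def page_faults(accesses, ram_pages):
    """Count page faults for an access sequence under LRU replacement."""
    ram = OrderedDict()                      # page -> present; order = recency
    faults = 0
    for page in accesses:
        if page in ram:
            ram.move_to_end(page)            # freshly used: most recent
        else:
            faults += 1                      # not resident: off to the disk
            if len(ram) >= ram_pages:
                ram.popitem(last=False)      # evict least-recently-used page
            ram[page] = True
    return faults

workload = [0, 1, 2, 3] * 5   # four apps touched round-robin, 20 accesses
# page_faults(workload, 4) -> 4   (cold-start faults only: everything fits)
# page_faults(workload, 3) -> 20  (one page short of the working set: thrash)
```

The cliff between "fits" and "doesn't fit" is why quitting the apps you aren't using makes such a dramatic difference.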

"Multiuser" doesn't really enter into the performance discussion. Really,
it's an accounting mechanism. It determines who is allowed to do what, and
where to send the bill for the services consumed. But it's the same
computer with the same memory and same disk doing the work, who gets the
bill has no effect on the throughput of the machine. On a desktop computer,
it's not really a consideration, because there is normally only one user
active (plus the system). On a home computer, you may get the situation
where Dad's email slows to a crawl because teenage daughter has downloaded
an entire music album and switched users so Dad doesn't know the download is
running. But such performance problems can usually be solved by sending Dad
to watch the ball game...

Now: Multitasking is a different issue. To begin with, no affordable
computer can do it, and never has been able to :) Multitasking literally
means that the computer can process work from more than one task
simultaneously. How? It only HAS one CPU!! It can only ever do one thing
at a time :) What the computer industry laughingly calls "multitasking" in
fact describes a computer's ability to task-switch very rapidly. Computers
just hitting the streets now have two CPUs, and could IN THEORY do two
things at once. Intel's "Hyperthreading" technology improves a computer's
ability to switch tasks fast by having two parallel sets of instruction
decoding pipelines to enable the computer to decode the instructions for the
next steps on one thread while executing he steps of the current thread.
But you still only get one task per CPU :)

Multitasking thus notionally means the computer appears to run SLOWER. In
theory, it is a "not good thing". On the face of it, if you are running one
application, the computer is giving that application 100 per cent of its
capability. If you are running ten applications, each application gets only
one tenth of the computer's power, and the user notices that the application
he is working in actually goes ten times slower in a multitasking operating
system.

So computer companies are lying through their back teeth. They can't
multitask, and never have been able to. But they can "pretend" to. In
practice, this is where a lot of the really black arts of computer science
are practiced. I am old enough to have begun computing with mainframes, and
they have always had some form of multitasking. A considerable part of the
system administrator's working life was spent adjusting the priorities and
resource allocations of programs to run the machine as efficiently as
possible. Still is, on a big mainframe.

And this "system tuning" is still the area that really determines the
quality of the user experience on a multitasking desktop computer. I know
that Microsoft put an enormous amount of energy into improving the
user-responsiveness of Windows -- I was on the beta test for Windows NT 4,
2000, and XP. There are two key things they can play with: the "minimum
time slice" and the "thread priority". Your typical G5 Mac is pedalling
along at two thousand million instructions per second. Unix has a bad habit
of defaulting to a 20-millisecond time slice. Keeping the maths simple,
every application gets handed 40,000,000 instructions' worth of processing
each time its request is answered. The OS won't look again, or give time to
anyone else, until those 20 milliseconds go by. Now, that kind of allocation
is fine if you want to typeset the Gettysburg Address. It was fine back in
the days when CPUs ran at eight million instructions per second. It might
even work today on a mainframe processing a large batch job. But to decide
that the user has pressed the letter "b" and send the content of the
keyboard buffer to Word? Gimme a break!

So the first thing you can do to make an OS seem responsive is to cut those
time slices way back. Windows XP uses really short time slices, and thus
hands out a lot more of them. No application ever has to wait "long" to get
the computer's attention.
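John's arithmetic is easy to check. A quick Python sketch (the 2-GHz rate and the slice lengths are the round numbers from his example, not measured values):

```python
CPU_INSTRUCTIONS_PER_SEC = 2_000_000_000   # "two thousand million" per second

def instructions_per_slice(slice_ms):
    """Work handed to an application in one uninterrupted time slice."""
    return CPU_INSTRUCTIONS_PER_SEC * slice_ms // 1000

def worst_case_wait_ms(slice_ms, runnable_tasks):
    """With equal priorities, a task may wait through everyone else's slice."""
    return slice_ms * (runnable_tasks - 1)

# instructions_per_slice(20) -> 40_000_000 (the 20 ms Unix-style default)
# instructions_per_slice(1)  -> 2_000_000  (a short, desktop-style slice)
# worst_case_wait_ms(20, 10) -> 180        (keystrokes start to feel laggy)
# worst_case_wait_ms(1, 10)  -> 9          (the UI feels instant)
```

Shorter slices hand out far less work per turn, but many more turns per second, which is exactly the trade-off that makes a desktop feel responsive.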

Then we can start playing around with Thread Priority. The way thread
priority works is that the OS hands out a hundred slices to the "High"
priority tasks, then 10 to the "medium" priority tasks, and then 1 to the
"low" priority tasks. The task waiting longest gets the slice handed out
next in each category (there are actually 255 priorities, but you get the
idea). Both Unix and Windows enable an application to adjust its
priority. If an application has nothing to do, when it next gets a time
slice it responds with "Nothing to do; set my priority to 'low'". If
Acrobat Reader were sitting in the background, it would set its priority to
'low'. Then you open a PDF. Reader then tells the OS on the next time
slice, "Really busy; jump me to High and call the ATSUI rendering engine for
me." Adjusting thread priorities is an art, not a science :) If an
application designer gets his thread priority too low, his application is
unresponsive. If he gets it too high, his application slows the whole
system down and he loses sales. In between, there's a sweet spot that seems
responsive to the user, without draining too much system resource or
interfering with other applications. Unfortunately, the sweet spot depends
on what else is installed and running at the time. And the application
designer can't know that...
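
The 100/10/1 hand-out described above can be modelled in a few lines. This
is a toy sketch, not any real scheduler: the weights, task names, and
round-robin-within-priority rule are taken straight from the paragraph
above.

```python
from collections import deque

# Toy illustration of the allocation described above: for every 100 slices
# handed to "high" tasks, "medium" tasks get 10 and "low" tasks get 1.
# (Real schedulers use many more priority levels plus aging.)
WEIGHTS = {"high": 100, "medium": 10, "low": 1}

def hand_out_slices(tasks, total_slices):
    """tasks: dict priority -> deque of task names. Returns slice counts."""
    counts = {name: 0 for q in tasks.values() for name in q}
    # One "round" of grants, proportional to each priority's weight.
    schedule = [p for p, w in WEIGHTS.items() for _ in range(w)]
    i = granted = 0
    while granted < total_slices:
        prio = schedule[i % len(schedule)]
        i += 1
        q = tasks.get(prio)
        if not q:
            continue
        task = q.popleft()      # longest-waiting task in this category...
        counts[task] += 1
        q.append(task)          # ...goes to the back of its queue
        granted += 1
    return counts

tasks = {"high": deque(["word"]), "medium": deque(["mail"]), "low": deque(["reader"])}
print(hand_out_slices(tasks, 111))
```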

A way around that is to design the application for multithreading. Nearly
all of the applications that existed before OS X and Windows NT were
"single-threaded". Their code was one large single piece. There was no
point in splitting them up. The application would start, do what it had to
do, then exit. Only one application could be running at a time, and it
would exit when it was finished. So there was no performance benefit in
splitting it into multiple pieces: quite the reverse.

Now, there is a benefit in doing that. You put the tasks the user will want
to do frequently in small modules that can exit quickly. You run these
modules at high priority. The system seems responsive to the user. You put
the things the user does not need to be involved with into different pieces,
and run them at a lower priority. Since the user never needs to be
involved, he doesn't know how long they take and doesn't care. Best of both
worlds: the system seems far more responsive and the computer runs very
efficiently.
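
As a sketch of that split: Python's standard library doesn't expose
portable thread priorities, so this models only the structure -- a
front-end that returns to the user immediately, and a background worker
that drains a queue of deferred work.

```python
import queue
import threading

# The responsive "front-end" module just enqueues work; the slow
# "pagination-style" work happens in a background worker thread.
jobs = queue.Queue()
results = []

def background_worker():
    while True:
        job = jobs.get()           # blocks; consumes no CPU while idle
        if job is None:            # sentinel: shut down
            break
        results.append(f"processed {job}")
        jobs.task_done()

worker = threading.Thread(target=background_worker, daemon=True)
worker.start()

# The "UI" enqueues work and returns immediately to the user.
for keystroke in ["a", "b", "c"]:
    jobs.put(keystroke)

jobs.join()                        # wait for the backlog to drain
jobs.put(None)
worker.join()
print(results)
```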

But splitting an application up is horrendously complex and difficult to
test, because all of the pieces may depend on work being done by another
piece. So application designers won't attempt it unless they can begin
their design that way.

Most "modern" applications have never gotten beyond "idle loop processing".
This is a sort of "cheat" multi-tasking. You take all of the bits that
interact with the user and put them in the Main thread. You take everything
else (all the bits that actually perform work) and place them into the Idle
Loop. The main Thread runs very frequently to check if the user wants to do
something, then falls quiet and calls its Idle Loop. Word uses this
technique. While you are typing, or moving the mouse, Word is doing nothing
but listen to you. When you stop typing or stop moving the mouse, that's
when Word begins to process the information you have given it. Pagination,
spelling, grammar, printing: all of those things happen only when you stop
to think. This technique benefits the whole system: if everyone builds
their applications this way, the system is much more responsive, and the
"work", the actual "processing" gets done only when the user doesn't want
the system for something else.
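
A minimal model of idle-loop processing as described above; the event and
chore names are invented for illustration:

```python
from collections import deque

# "Cheat" multi-tasking: user events are handled first; only when the
# event queue falls quiet do the deferred chores (pagination, spell-check)
# get a turn.
user_events = deque(["type:h", "type:i"])
idle_chores = deque(["repaginate", "spell-check"])
log = []

def pump():
    while user_events or idle_chores:
        if user_events:                 # main thread: the user comes first
            log.append(f"handled {user_events.popleft()}")
        else:                           # quiet: give the idle loop a turn
            log.append(f"idle ran {idle_chores.popleft()}")

pump()
print(log)
```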

But it also leads to some peculiar effects. The "Beachball" is one. Word
calls a high-priority interrupt when it needs to display what has happened
as a result of something you have done. But it then needs to sit there and
wait until the background task (the idle loop) gets enough time in the CPU
to complete making up the thing to be displayed. You and I get to watch the
beachball until that happens. Cunning old farts like me know that Word is
waiting for the idle loop to get some time. We know it will get some time
right after the Main loop completes each time. And we know that if we click
the mouse button, the main loop will be called. So if you want the
beachball to stop and show you what you typed, click the mouse :)
For that reason, I suspect that it's going to be hard to again
match the great responsiveness and feel of a single user pseudo
multitasking OS such as the original Mac where you could sizzle
through operation after operation and the computer could actually
keep up with you.

No. But it is a LOT of laborious detailed work, splitting applications up
into multiple threads, adjusting and tweaking the priorities of each,
coding, and testing. It doesn't add ANY "new features" that you
could give to Marketing to SELL. But it has a very high chance of adding
LOTS of lovely new BUGS -- timing issues where the justification routine
crashes because it was waiting for the pagination routine which was waiting
for the printing routine when all three were interrupted by the user hitting
the delete key...

This is not something software vendors are actually "enthusiastic" about.
The move to MacIntel may assist them to learn to love it :) Not that
anyone in the computer industry has EVER run out of excuses or the ability
to point the finger somewhere else to blame someone else.

But they will need to be a little more inventive to come up with a
convincing excuse when we point out that "your application runs ten times
faster on Mac OS than it does on Windows OS using the same computer to
process the same file." Might have to fix it then...

Cheers

--

Please reply to the newsgroup to maintain the thread. Please do not email
me unless I ask you to.

John McGhie <[email protected]>
Microsoft MVP, Word and Word for Macintosh. Consultant Technical Writer
Sydney, Australia +61 4 1209 1410
 
E

Elliott Roper

John McGhie [MVP - Word] said:
Hi Jeff:

Is it Sunday already? Must be time to lose the rest of the weekend in a
discussion with Jeff :)

...and a fine discussion it is too. May I join in?
On 16/9/05 12:15 AM, in article #[email protected], "Jeff


Yes, they do. But I am not sure that's the fault of Unix. I suspect that
part of that is due to application software vendors being unwilling to
re-architect their applications for a multi-threading, multi-tasking
environment.

Unix has a bad habit
of defaulting to a 20-millisecond time slice. Keeping the maths simple,
every application gets handed 40,000,000 instructions' worth of processing
each time its request is answered. The OS won't look again, or give time to
anyone else, until those 20 milliseconds go by; with ten busy tasks queued
up, your application can wait 200 milliseconds for its next turn. Now, that
kind of allocation is fine if you want to typeset the Gettysburg Address.
It was fine back in the days when CPUs ran at eight million instructions
per second. It might even work today on a mainframe processing a large
batch job. But to decide that the user has pressed the letter "b" and send
the content of the keyboard buffer to Word? Gimme a break!

So the first thing you can do to make an OS seem responsive is to cut those
time slices way back. Windows XP uses really short time slices, and thus
hands out a lot more of them. No application ever has to wait "long" to get
the computer's attention.

You know very well that is not the whole story. Both Unix and NT will
consider rescheduling threads on every significant interrupt, only one
of which is the clock tick that expires a time slice interval. I/O
completing earlier in another thread does just as well. Both NT and OS
X do their scheduling via 'kernel threads'. The time slicing comes into
play only when nothing else is happening and more than one compute-bound
thread is competing for the processor's attention.
Then we can start playing around with Thread Priority. The way thread
priority works is that the OS hands out a hundred slices to the "High"
priority tasks, then 10 to the "medium" priority tasks, and then 1 to the
"low" priority tasks. The task waiting longest gets the slice handed out
next in each category (there are actually 255 priorities, but you get the
idea). Both Unix and Windows enable an application to adjust its own
priority. If the application has nothing to do, when it next gets a time
slice it responds with "Nothing to do; set my priority to 'low'". If
Acrobat Reader were sitting in the background it would set its priority to
'low'. Then you open a PDF. Reader then tells the OS on the next time
slice "Really busy, jump me to High and call the ATSUI rendering engine for
me." Adjusting thread priorities is an art, not a science :) If an
application designer gets his thread priority too low, his application is
unresponsive. If he gets it too high, his application slows the whole
system down and he loses sales. In between, there's a sweet spot that seems
responsive to the user, without draining too much system resource or
interfering with other applications. Unfortunately, the sweet spot depends
on what else is installed and running at the time. And the application
designer can't know that...
... of course it's simpler than that for the administrator of the machine,
who does not need, or even get, many tools to play with (nice and renice).
Threads are scheduled for execution strictly by priority, but the
scheduler plays with compute-intensive thread priority on the fly within a
'round-robin range' to deal out resources fairly.
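
For what it's worth, a process can turn the same knob on itself; a small
sketch of POSIX niceness via Python's `os.nice` (Unix-only; an
unprivileged process can only raise its nice value, i.e. lower its own
priority):

```python
import os

# nice and renice are the administrator's knobs; a program can also be
# "nicer" on its own initiative, as described in the discussion above.
before = os.nice(0)   # an increment of 0 just reports the current niceness
after = os.nice(5)    # raise our nice value: give way to other tasks
print(before, after)
```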
A way around that is to design the application for multithreading. Nearly
all of the applications that existed before OS X and Windows NT were
"single-threaded". Their code was one large single piece. There was no
point in splitting them up. The application would start, do what it had to
do, then exit. Only one application could be running at a time, and it
would exit when it was finished. So there was no performance benefit in
splitting it into multiple pieces: quite the reverse.

Now, there is a benefit in doing that. You put the tasks the user will want
to do frequently in small modules that can exit quickly. You run these
modules at high priority. The system seems responsive to the user. You put
the things the user does not need to be involved with into different pieces,
and run them at a lower priority. Since the user never needs to be
involved, he doesn't know how long they take and doesn't care. Best of both
worlds: the system seems far more responsive and the computer runs very
efficiently.
I think you are comparing pre-emptive and co-operative scheduling. In
OS9 and earlier, if a compute bound process hogged the machine, nothing
else would be scheduled until it gave the OS permission to do so.
Typical might be a four hour video render that would not even allow the
menubar clock to advance, let alone permit the user to join in debate
with John and Jeff on usenet. That was co-operative scheduling. When
done well by every program in the machine it was very very good, but
when one went bad, it was horrid.

Timeslicing is old-style Unix's method of permitting many processes to
pretend to execute at once. You do get something for nothing: when one
process is waiting for I/O, another can run before the first's slice
expires. With old-style Mac OS, the unco-operative process was permitted
to sit there going 'la-la-la, I can't hear you' till the disk finished
getting his stuff. Timeslicing is simple pre-emptive scheduling: the slice
expiry cut Mr La-la off at the pass.

With kernel thread scheduling, as in NT and OS X, the author of an
individual process gets the option of writing a mini operating system just
for his own program: multiply-buffered asynchronous I/O, say, so he can
get some of his own work done while waiting for his own I/O, as well as
responding quickly to user-interaction interrupts like keystrokes and
mouse moves and clicks.
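
That "mini operating system" style looks roughly like this in a modern
event-loop framework; a sketch using Python's asyncio, with
`asyncio.sleep` standing in for the slow I/O:

```python
import asyncio

# Overlapping the program's own slow I/O instead of blocking on it: both
# "reads" are in flight at once, so total time is roughly the longer one,
# not the sum.
async def fetch(name, delay):
    await asyncio.sleep(delay)     # stands in for a disk or network read
    return f"{name} done"

async def main():
    return await asyncio.gather(fetch("buffer-1", 0.05),
                                fetch("buffer-2", 0.05))

print(asyncio.run(main()))
```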
But splitting an application up is horrendously complex and difficult to
test, because all of the pieces may depend on work being done by another
piece. So application designers won't attempt it unless they can begin
their design that way.

So true.

Most "modern" applications have never gotten beyond "idle loop processing".
This is a sort of "cheat" multi-tasking. You take all of the bits that
interact with the user and put them in the Main thread. You take everything
else (all the bits that actually perform work) and place them into the Idle
Loop. The main Thread runs very frequently to check if the user wants to do
something, then falls quiet and calls its Idle Loop. Word uses this
technique. While you are typing, or moving the mouse, Word is doing nothing
but listen to you. When you stop typing or stop moving the mouse, that's
when Word begins to process the information you have given it. Pagination,
spelling, grammar, printing: all of those things happen only when you stop
to think. This technique benefits the whole system: if everyone builds
their applications this way, the system is much more responsive, and the
"work", the actual "processing" gets done only when the user doesn't want
the system for something else.
Yep, what you describe is the GUI curse. Horrid, horrid callback land.
But it also leads to some peculiar effects. The "Beachball" is one. Word
calls a high-priority interrupt when it needs to display what has happened
as a result of something you have done. But it then needs to sit there and
wait until the background task (the idle loop) gets enough time in the CPU
to complete making up the thing to be displayed. You and I get to watch the
beachball until that happens. Cunning old farts like me know that Word is
waiting for the idle loop to get some time. We know it will get some time
right after the Main loop completes each time. And we know that if we click
the mouse button, the main loop will be called. So if you want the
beachball to stop and show you what you typed, click the mouse :)

...if you are lucky. You are at the mercy of the application's designer
for that to happen.
You had a better experience of that than I remember. I hated it.
No. But it is a LOT of laborious detailed work, splitting applications up
into multiple threads, adjusting and tweaking the priorities of each,
coding, and testing. It doesn't add ANY "new features" that you
could give to Marketing to SELL. But it has a very high chance of adding
LOTS of lovely new BUGS -- timing issues where the justification routine
crashes because it was waiting for the pagination routine which was waiting
for the printing routine when all three were interrupted by the user hitting
the delete key...

Nah. All you have to do is get out of the callback loop mindset. Done
right, it is easier than before. Done wrong, it is a similar disaster.
A great example of wrong is Tiger's finder spotlight search. Boy, is
*that* a 'work-in-progress'. Or Tiger's mail. They have introduced an
interesting race between the threads that list mailbox content. Scroll
the mailbox list too fast and the messages appear as if in the wrong
mailbox.

Hmm. I have made your point for you, haven't I? Done right, threaded
applications are easy. Honest! (I was brought up properly in VMS and
RSX before that, where asynch queued I/O with AST's (Asynchronous
System Traps) was the norm. Even there, we got the same callback
nightmares with the X11 Windowing system.)
This is not something software vendors are actually "enthusiastic" about.
The move to MacIntel may assist them to learn to love it :) Not that
anyone in the computer industry has EVER run out of excuses or the ability
to point the finger somewhere else to blame someone else.

But they will need to be a little more inventive to come up with a
convincing excuse when we point out that "your application runs ten times
faster on Mac OS than it does on Windows OS using the same computer to
process the same file." Might have to fix it then...

That is one competition I'm watching with interest. Much as I like OS
X, I think the NT kernel's scheduler will be hard to beat. I will be
very surprised if OS X is faster on identical machines in that
department.
 
J

Jeff Wiseman

John said:
Hi Jeff:

Is it Sunday already? Must be time to lose the rest of the weekend in a
discussion with Jeff :)


I appear to be getting a "reputation" in the group. I hate it
when that happens (thanks for changing the subject line).

Enjoy it while you can though, it appears that I have finally
gotten myself my first real full time job in a while so I may not
have as much opportunity to be online as much.

As long as I am entertaining and/or informative though, I guess I
should hang around...

:)

OK, there's two issues there. I was talking about the Windows XP GUI. I
find it quicker to get stuff done: it may be ugly, but it's fast. Even if
you know it well, OS X seems to involve more keystrokes and contortions to
do things.

That would be at least in part because I come from a Windows background, so
my computer and folder structures and such are all set up that way. If I
were to take advantage of all the power tools buried in OS X and change my
working style, I suspect I would find less disadvantages in the UI.


I suspect that what you say is very true. The desktop paradigm as
implemented over Darwin is going to require some years to
"evolve" to become as effective for many applications (i.e. how
people do their work) as the traditional Windows UI was. A lot of
this may also be due to the Windows UI having so many different
ways to do the same thing (unfortunately each with its own
caveats), so everyone can eventually develop their own desktop
processes.

Yes, they do. But I am not sure that's the fault of Unix. I suspect that
part of that is due to application software vendors being unwilling to
re-architect their applications for a multi-threading, multi-tasking
environment.


This is certainly part of it. E.g., *NO* application running on
any Unix-type subsystem should *EVER* sit in a polling loop
(i.e., the very way that Word functions). It should block on
input in an event-driven style. Otherwise it is chewing up and
wasting CPU cycles, and it forces the RAM backing that process's
virtual memory to stay tied up as well.
This has always been extremely bad programming style IMHO.
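The difference Jeff is pointing at, sketched in Python: the consumer below
blocks in `queue.get()` and burns no CPU until an event arrives, where a
polling design would spin repeatedly checking for input:

```python
import queue
import threading

# Event-driven style: the consumer is blocked in the kernel until
# something arrives, rather than polling in a busy loop.
events = queue.Queue()

def event_driven_consumer(out):
    while True:
        item = events.get()        # blocked: zero CPU until an event arrives
        if item is None:           # sentinel: shut down
            break
        out.append(item)

received = []
t = threading.Thread(target=event_driven_consumer, args=(received,))
t.start()
events.put("keystroke")
events.put(None)
t.join()
print(received)
```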

I guess one of the most significant issues that I was mainly
alluding to is the time to load an application (or some of its
processes if it is a multi-process type application) and start
it/them running. The necessary evil of a full-blown virtual
memory, multi-tasking system is that the loading and initiating
of processes tends to be more complex.

Note that the multi-tasking and virtual memory mechanisms of both
the old Mac OS and Windows are really very simplistic compared to
the way it is done in UNIX. More on this below.

Certainly it wouldn't be virtual memory that's doing it (unless you happen
to be "using" the virtual memory...) :) We've had virtual memory since
the days of DOS and Mac OS 7 or whatever.


Again, this is true but their implementations are fairly
simplistic. Loading an application into memory now requires a
view of segmentation and such maintained by the processors and
the OS. From what I understand, this involves more overhead in a
full blown fully virtual memory system like UNIX.

This is why in OS X (or other Unix based systems) an application
that is loaded but not being used (i.e., hasn't required any CPU
cycles for a while) can eventually swap totally out to disk and
basically will not use any of the computer resources except disk
space but can be recalled almost instantly when called upon to
run again. This is good in that if you have the disk space, you
can have literally thousands of tasks in the running but blocked
mode (i.e., waiting for input events) and the system can be just
as responsive as if only a couple of active tasks were running.

However, when a task is blocked waiting on input and has been
swapped out, there can appear to be delays as it swaps back in to
process its events. This can appear as a sluggishness to respond
to UI input. In the original Mac OS (and I believe Windows as
well), part of the application that handles input is left in RAM
space in order to avoid the delays. However, that means you can't
use that RAM space for anything else until you quit the
application. This is why leaving unused applications running on
the old Mac OS or in Windows is a bad idea whereas leaving unused
applications running on a Unix system doesn't really hurt
anything and in fact can be good since the next time it is
required to run, it is available nearly instantly.

My point was basically that simplistic VM typically was built in
a way that full swapouts didn't normally occur because it was so
difficult to reload them fast for certain types of input (usually
because the pieces being swapped in and out were monolithic--huge
chunks decided upon by the developer). The Unix type VM swaps in
many tiny pieces and is transparent to the programmer. This has
the advantage that the entire application can be swapped out if
necessary, or only enough pieces to allow what ever other
processes need to be running. The downside is that there are:

1) potential delays swapping an app back in to handle its UI
input, and

2) extra complexity (i.e., delays) loading an application the
first time.

Virtual memory simply assigns a portion of the disk to impersonate "real"
memory. If the user runs more applications than he has memory for, the
system "pages" the content of one or more chunks of an existing application
out to virtual memory to make room. If that happens much, the system gets
seriously slow. Anyone with a gig of real memory in OS X (or Windows) will
never hit virtual memory, PROVIDED they quit each application when they
finish with it.


All true. However my point was that with a system that has
defined very small "chunks" as swappable (e.g., UNIX), pieces
that are rarely used get swapped out first, so swapping-activity
slowdowns are not as severe in general. The RAM use becomes far
more effective. A skilled programmer can design an application to
function effectively with this in both UNIX and the older
Windows/Mac OS environments. However, a very bad programmer
cannot cause as much trouble through his application design on a
UNIX system as he can on the older personal-OS-type systems.

If they adopt the old Mac user's operating method of leaving everything
running between uses, then they WILL be hitting virtual memory and their
system will be slow :) I can't seem to get this through to Windows XP
users either: if you don't quit stuff you are not using, eventually your
system starts flogging the disk looking for memory and everything gets
veeerrry s l o o o o o w w w w . . . .


Exactly. But on something like UNIX, although slowdown will
occur, you won't get the extremes unless many processes are
written poorly (e.g., polled inputs like Word uses) and try to
all lock themselves into memory.

"Multiuser" doesn't really enter into the performance discussion. Really,
it's an accounting mechanism. It determines who is allowed to do what, and
where to send the bill for the services consumed. But it's the same


To support a very effective multiuser system, there will be other
overheads added. If nothing else, there are more file trees
to navigate and extra security processes running on
the system (e.g. group/world/owner ownership checks,
etc.). I have no idea how significant this extra overhead
is--it may be trivial--but I think it is finite, so that is why
I mentioned it.

BTW, a good multiUSER system (as opposed to multi-tasking) is
going to guarantee that a single user can't take over the system.
If a user starts a CPU-intensive task (such as compiling
a program, etc.), the OS is not going to allow that to go very far
before yanking the CPU away from that task in some fashion. UNIX
has a very effective way of dynamically assigning task priorities
to do this that has been more or less in place for many decades
now.

The analogy is that Word is like a bacterium in a healthy body
when it is run over Darwin. The prioritizing algorithm of Darwin
sees this process that wants to just keep running because of its
silly "idle loop" design, and so it will attempt to protect the
rest of the system by dropping Word's run-time priority in stages.

Now: Multitasking is a different issue. To begin with, no affordable
computer can do it, and never has been able to :) Multitasking literally
means that the computer can process work from more than one task


Of course, I was referring to virtual tasks and the time
sharing/time slicing mechanisms of the OS. So within the context
of my original comments all PCs/Macs are multi-tasking. Some are
orders of magnitudes above the others in their capabilities.

simultaneously. How? It only HAS one CPU!! It can only ever do one thing
at a time :) What the computer industry laughingly calls "multitasking" in
fact describes a computer's ability to task-switch very rapidly. Computers


I will forego a sudden impulse to discuss the incremental theory
of time and some of Dr. Who's exploits, regardless of how
relevant they may be here :)


<<great discussion on time slicing, multithreading, and
prioritizing deleted>>
But splitting an application up is horrendously complex and difficult to
test, because all of the pieces may depend on work being done by another
piece. So application designers won't attempt it unless they can begin
their design that way.


Exactly. As you pointed out, with the original PC and Mac systems
it was, in fact, up to the application designers to cooperate
with each other to "tune" all these things. This is NOT a
cohesive way of controlling system-level operations. A system's
performance should not depend on an application's design. System
tuning and control must be available at the system level. I.e.,
nearly all system tuning must be possible via the OS itself and
not depend on how smart a particular application's developer was.
UNIX (i.e., Darwin) does this. PCs and the original Mac OS really
don't, and require the cooperation and skill of developers
following a certain set of standards and philosophies in order to
achieve a responsive system. In Unix, if a process tries to hog
the system, the OS will take away its ability to eat CPU cycles
at a high rate.

Most "modern" applications have never gotten beyond "idle loop processing".
This is a sort of "cheat" multi-tasking. You take all of the bits that
interact with the user and put them in the Main thread. You take everything
else (all the bits that actually perform work) and place them into the Idle
Loop. The main Thread runs very frequently to check if the user wants to do
something, then falls quiet and calls its Idle Loop. Word uses this
technique.


The issue is that there should only ever be one "idle loop" in
the entire system. Everything else should be event
driven. In Unix, that idle loop (i.e., the scheduler) is in the
OS kernel. Word wants to be the entire OS, which is the wrong
approach. Unfortunately, because of its massive legacy and the
fact that this is an infrastructure issue, it is very unlikely
that this will ever be corrected. However, I believe that there
may be hope in OS X, since the OS provides enough tools to make
such a change easy. The problem is that if MS took advantage of
this, their OS X product(s) would then have infrastructures
significantly different than their PC products, so again, this is
unlikely to happen.

No. But it is a LOT of laborious detailed work, splitting applications up
into multiple threads, adjusting the priorities of each, tweaking their
priorities, coding, and testing. It doesn't add ANY "new features" that you
could give to Marketing to SELL.


Again, I think that the development issue that you present,
although entirely valid on a PC or even a classic Mac OS, doesn't
apply as well on a UNIX-based system such as Mac OS X, IMHO. I
suspect it would still be a massive restructuring effort to
convert it to an event-driven interface, but it would result in a
significantly smaller application, since many of the tests and
gotchas currently in the product would not be necessary.

This is not something software vendors are actually "enthusiastic" about.
The move to MacIntel may assist them to learn to love it :)


Unfortunately, I believe the issue is the OS and not the hardware
it runs on. Although the exact way the VM works is dependent on
the processor's capabilities, with few exceptions multitasking is
a software-bound function. Darwin has the capabilities, but the
infrastructure of Word assumes a totally different environment,
and Word is so mated to that environment that environment-specific
issues saturate its innards.

In summary, I do think that the characteristics that make the
Unix subsystem of OS X as great as it is will also contribute a
bit to a sluggishness that earlier systems didn't have. A lot of
that can be reduced by system tuning and by applications evolving
to use better design techniques. Applications that don't follow
good programming rules will impact the system some, but not like
they used to in the old-style OSs.
 
J

Jeff Wiseman

Elliott said:
..and a fine discussion it is too. May I join in?


You're not allowed to trash his Monday though...

You had a better experience of that than I remember. I hated it.


Actually, come to think of it, I never liked running more than
one task at a time (i.e., I didn't even have the multifinder
installed for the longest time). I would simply kill one task and
start another. It was really fast so I guess I liked the
simplicity and the absence of out-of-memory errors. I guess that
my multitasking usage was so limited on the original mac OS that
I really shouldn't be making comments on it :)

Ahhh, the simple life...

That is one competition I'm watching with interest. Much as I like OS
X, I think the NT kernel's scheduler will be hard to beat. I will be
very surprised if OS X is faster on identical machines in that
department.


I too suspect that you are right. However, I would rather have a
system with a good orthogonal security system and a stable,
generic infrastructure. As long as my speeds are
"reasonable" I will give up a lot of zip any day for an OS that
doesn't have to have infrastructure changes and fixes every time
someone writes a new application for it revealing some yet-unseen
loophole in the OS.
 
