Why are copies of MDE FE's considered best Access 2002 practice?


Stephen J. Levine MD

I am now starting to question why giving each client a
copy of an mde front end is considered best practice in a
multi-user environment using Access 2002.

Understand that I am not questioning the best practice of
splitting the database. I support this practice in part
because I believe that, long term, the database itself in
a multiuser environment should be placed on Oracle or SQL
Server as I do not feel JET is robust enough for this
deployment. I understand that Microsoft appears to be
heading toward using Access as a front end for SQL Server.

There appears to be very little in the literature to
support individual front end copies on the client as best
practice, other than a statement in a Microsoft Access
2003 document cited in other threads on this topic and
strong statements by some on this newsgroup that
violating the practice is tantamount to incompetence.
When the discussion turns to the logical corollary
question of whether each instance on a client needs its
own copy of the fe, the responses have generally been
weak, derogating the questioner for suggesting the need
for multiple instances rather than addressing the
question directly. This suggests that the data supporting
this practice with regard to Access 2002 may not be good,
if it even exists.

For example, with my own application, all session-
specific data is stored in public and private variables
and arrays rather than in temporary tables or database
properties. Static settings, including the menubar, are
stored as properties of the database.
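
Something along the following lines is what I mean; the names are illustrative only, not taken from the actual application. Session state lives in module-level variables in a standard module of the front end, so nothing session-dependent is ever written back into the mdb/mde file:

' Standard module in the front end; all names here are illustrative only.
Option Explicit

Public gstrCurrentUser As String                 ' who is logged into this session
Public glngCurrentEventID As Long                ' record this session is working with
Private mastrRecentLookups(1 To 20) As String    ' per-session lookup history

' Remember a lookup criterion for this session only; nothing is written
' back to the front end file, so concurrent sessions cannot collide here.
Public Sub RememberLookup(ByVal strCriteria As String)
    Dim i As Integer
    For i = UBound(mastrRecentLookups) To LBound(mastrRecentLookups) + 1 Step -1
        mastrRecentLookups(i) = mastrRecentLookups(i - 1)
    Next i
    mastrRecentLookups(LBound(mastrRecentLookups)) = strCriteria
End Sub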

I understand that startup properties are static
attributes of the database and not dynamic attributes of
the session, and thus cannot be altered in a multiuser
environment where a single fe is used. This became
apparent as I was developing a subsidiary application
where I wanted to change the menubar based on security.
That approach has been abandoned.
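
For reference, the usual way those startup attributes are manipulated is as named properties of the database itself, which is exactly why they are per-file rather than per-session. A rough sketch, using the commonly published ChangeProperty pattern (the property name StartupMenuBar and the menu bar name are assumptions to verify against your own file):

' Set (or create) a named startup property on the database itself.
' Because the property belongs to the MDB/MDE file, every session sharing
' that file sees the same value.
Function ChangeProperty(strPropName As String, varPropType As Variant, _
                        varPropValue As Variant) As Boolean
    Dim dbs As DAO.Database
    Dim prp As DAO.Property
    Const conPropNotFoundError = 3270

    Set dbs = CurrentDb
    On Error GoTo Change_Err
    dbs.Properties(strPropName) = varPropValue
    ChangeProperty = True

Change_Bye:
    Exit Function

Change_Err:
    If Err = conPropNotFoundError Then
        ' Property did not exist yet, so create it and append it.
        Set prp = dbs.CreateProperty(strPropName, varPropType, varPropValue)
        dbs.Properties.Append prp
        Resume Next
    Else
        ChangeProperty = False
        Resume Change_Bye
    End If
End Function

' Example call (menu bar name is hypothetical):
'   ChangeProperty "StartupMenuBar", dbText, "mnuRestricted"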

The only personal experience I have had supporting
separate fe's for each client was that the criterion for
one of my record lookups using a form called Event became
a permanent value of that form's Filter property. This
raised the question of whether the property is handled
during a session as a property of the form, and therefore
whether its value in one session could be altered by a
simultaneous session using the same fe file.

To attempt to answer this question, I performed the
following test using the same Access front end (an .mdb,
for convenience) to see if two instances of it could
interfere with each other:

At the start of the test, in the database window of each
instance, before launching the form in either instance:
Instance 1: Forms!Event.Filter = "Etracking = 127"
Instance 2: Forms!Event.Filter = "Etracking = 127"

In Instance 1, look up the Event form record where
Etracking = 128. Then examine the value of
Forms!Event.Filter in design view in both instances:
Instance 1: Forms!Event.Filter = "Etracking = 128"
Instance 2: Forms!Event.Filter = "Etracking = 127"

Exit back to the database window in Instance 1. In
Instance 2, look up the Event form record where
Etracking = 130. Then examine the value of
Forms!Event.Filter in design view in both instances:
Instance 1: Forms!Event.Filter = "Etracking = 127"
Instance 2: Forms!Event.Filter = "Etracking = 130"

Exit back to the database window in Instance 2. Then
examine the value of Forms!Event.Filter in design view in
both instances:
Instance 1: Forms!Event.Filter = "Etracking = 127"
Instance 2: Forms!Event.Filter = "Etracking = 127"
Conclusion: Lookups of different records in two different
instances of the same Access front end do not interfere
with each other.

I suspect that the Forms!Event.Filter value of
"Etracking = 127" was set at some point during
development, when design changes were saved in the form
on exit. As this form is no longer being changed in this
version, the value is no longer at risk of change, and in
future versions this property will be cleared before
implementation (a small sketch of that cleanup follows).
What is very reassuring is that each time I went into
design mode to look up the value of this property, I
received the warning that the file was not opened
exclusively and thus no changes would be saved.
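
A rough sketch of the kind of pre-release cleanup I have in mind, assuming the standard Access object model (the routine name is mine):

' Run in the development copy, opened exclusively, before building the MDE:
' clears any Filter saved at design time so no leftover development criteria
' ship with the released front end.
Sub ClearSavedFilters()
    Dim aob As AccessObject
    For Each aob In CurrentProject.AllForms
        DoCmd.OpenForm aob.Name, acDesign
        Forms(aob.Name).Filter = ""
        Forms(aob.Name).FilterOn = False
        DoCmd.Close acForm, aob.Name, acSaveYes
    Next aob
End Sub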

While this little exercise does not debunk the
best-practice standard of a separate mde fe copy for each
client, it does call it into question. Because the
standard seems at variance with what would be considered
good electronic version control practice, namely a single
copy in one place, I need documentation further
supporting or refuting it. And because the standard, as
it stands, would mean separate copies for each user
instance on a client at our institution, we really need
to know more about the reasoning behind it and how it
applies to this application, which was coded to allow
multiple instances through the use of dimensioned
variables and arrays for session-dependent data.

sjl
 

Tony Toews

Stephen J. Levine MD said:
I am now starting to question why giving each client a
copy of an mde front end is considered best practice in a
multi-user environment using Access 2002.

Sharing copies of the FE leads to a greatly increased chance of
corruption in A2000 and newer. I've had clients where the only
corruption happened in the FE because, somehow, a volunteer had
managed to copy the MDE FEs to the server and share them.

It's also much more difficult to deploy updates in this fashion. You
have to wait until all the users log out before you can put a new copy
of the FE in place.

Standard blurb follows. <smile>

I specifically created the Auto FE Updater utility so that I could
make changes to the FE MDE as often as I wanted and be quite confident
that the next time someone went to run the app it would pull in
the latest version. For more info, see the free Auto FE Updater
utility at http://www.granite.ab.ca/access/autofe.htm on my website;
it keeps the FE on each PC up to date.
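
For readers who just want the flavour of the approach, here is a much simpler sketch of the same idea; it is not the Auto FE Updater itself, and the paths are hypothetical. A tiny launcher copies the master FE from the server to the local disk and then opens the local copy:

' Bare-bones launcher: copy the master FE from the server to the local disk,
' then open the local copy. Paths are examples only; error handling omitted.
Sub LaunchLocalFrontEnd()
    Const strServerFE As String = "\\Server\Share\MyApp.mde"   ' hypothetical master
    Const strLocalFE As String = "C:\AppLocal\MyApp.mde"       ' hypothetical local copy

    On Error Resume Next
    MkDir "C:\AppLocal"        ' ignore the error if the folder already exists
    On Error GoTo 0

    ' Overwrite the local copy with the current master (fails if it is in use).
    FileCopy strServerFE, strLocalFE

    ' Open the fresh local copy in a new Access instance, then close the launcher.
    Shell """" & SysCmd(acSysCmdAccessDir) & "msaccess.exe"" """ & strLocalFE & """", _
        vbNormalFocus
    Application.Quit
End Sub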
Understand that I am not questioning the best practice of
splitting the database. I support this practice in part
because I believe that, long term, the database itself in
a multiuser environment should be placed on Oracle or SQL
Server as I do not feel JET is robust enough for this
deployment.

Depends on the stability of the hardware. I have a client with 20-30
users in all day long on a 300 MB BE with a corruption every six
months or so.
I understand that Microsoft appears to be
heading toward using Access as a front end for SQL Server.

<shrug> If anything I would've said ASP.NET, VB.NET, C#.NET or
similar. Mind you from what I've read development time is
significantly higher in these systems. Partially because the Access
IDE has so many features. Takes me a lot longer to do things on forms
in VB6.

Tony
--
Tony Toews, Microsoft Access MVP
Please respond only in the newsgroups so that others can
read the entire thread of messages.
Microsoft Access Links, Hints, Tips & Accounting Systems at
http://www.granite.ab.ca/accsmstr.htm
 

Stephen J. Levine MD

What is the nature of the corruption that you are seeing?
I have been meaning to ask you that.

Also, what has your experience been with multiple access
sessions with the same FE?

sjl
 

Van T. Dinh

In addition to Tony's points: if there is any problem with a user's copy of
the Front-End, only one user is affected, not all users as would happen in a
shared Front-End setup. I was recently asked by a new client to fix a
database urgently because a corrupted shared Front-End meant no one could
use the database.
 

Arvin Meyer

Stephen J. Levine MD said:
What is the nature of the corruption that you are seeing?
I have been meaning to ask you that.

Also, what has your experience been with multiple access
sessions with the same FE?

Very simply, sending a front-end over the network increases the number of
packets sent and received. The more packets moved, the more data collisions,
and the greater the chance of a malformed packet. If a front-end file gets
corrupted, or bloated, or anything else happens to it, all one need do is
download a new copy from the server.

Most corruption occurs in OLE fields (Memo, hyperlink, bound and unbound OLE
fields) because the data is not stored on the same page. Pointers are used
to write a disk address to the table. When any other index is corrupted,
it's relatively easy to rewrite. (That's part of what compacting does). Not
so with pointers. They are only easily rewritten if they were correct in the
first place. If the disk address cannot be found, or it doesn't match the
record in the pointer, the row or page is marked as corrupted.

Additionally, it's an incredible waste of network resources.

I have had very little corruption in my databases over the past 11+ years.
All of it has been directly traceable to hardware failure or user error. I
have databases that have gone for years on end without experiencing
corruption. Almost all the corruption I've ever gotten during development
has come from running the database from the server, even when there have
been no other users. I now ignore any requirement to run a front-end of any kind
(other than ASP) over a server. I don't know whether you've noticed it or
not, but no web application runs as flawlessly as applications which run
from a workstation. Adding complexity always ensures less reliability.
--
Arvin Meyer, MCP, MVP
Microsoft Access
Free Access downloads:
http://www.datastrat.com
http://www.mvps.org/access
 

Stephen J. Levine MD

Arvin

Thanks for your reply.

Interestingly enough, we use our network a lot. I and
others keep our documents and other files in our network
folder, which we are each assigned. This way we can take
advantage of the nightly backup.

Like you, I had an Access application run for 2-3 years
without corruption in the database. It was in a critical
area, was not split, but did not have more than 1 user at
a time. We do not use it now because we changed our
method of doing this particular lab test.

The only other question I have is that you mention that
corruption occurs in OLE fields and indexes. These would
be expected to be in the back end rather than the front
end. Understanding that you generally do not create front
ends, I am mainly posing the following question for client
fe copy advocates:

If data corruption occurs in OLE fields and in indexes,
and there are none of these in a front end, how does
having the front end on the client prevent that corruption?

Thanks to all who are bearing with me on this thread. I
am learning a lot from it.

sjl
 

Guest

Van

Do you know what the nature of the corruption in the front
end was?

Another issue one could raise about individual copies of
the front end on PCs is that, in a pharmaceutical
setting, each copy would need to be validated. I wonder
if anyone from the pharmaceutical industry is on this
newsgroup and would care to comment.

sjl
 

Stephen J. Levine MD

My apologies about this post appearing as anonymous.

I usually write my reply and then put the sender
information in. In this case, I forgot to put the sender
information in before sending.

My goof
sjl
 

Albert D. Kallal

The only other question I have is that you mention that,
corruption occurs in ole fields and indexes. These would
be expected to be in the back end, rather than the front
end.

Yes, that is correct.
If data corruption occurs in ole fields and in indexes, if
there are none of these in a front end, how does having
the front end on the client prevent these?

Well, the front end often will corrupt also. And then users trying to launch
the application will get an error like "the file you are attempting to use is
NOT a ms-access file". So, sure, no one is saying that the front end will not
get damaged. However, the back end can also be damaged in a multi-user
environment.

Further, reducing network load, network collisions and the chances of errors
on the network is a good thing. This reduction in load obviously applies to
the back end also.

Less traffic = less work = fewer problems

The concept of reducing network traffic and thus increasing reliability
should not be a surprise to anyone in the computer industry. I am not sure
why you think reducing network traffic would not increase reliability.
Why should this idea be a surprise to you?

In addition, there is the issue of temporary tables and temporary work items
that each user needs. Normally these temp items are not a problem for
conflicts, but sometimes they are (and for sure, more users = increased
chance of problems). When you create a query on the fly, some information is
often stored in temp work items. Further, even things like the position of
the form on your screen are saved (if you remove the auto-center option, the
position of the form can be saved when you whack Ctrl-S, for example; all
this stuff gets stored somewhere). You can even look at some of this temp
stuff if you want: go to Tools->Options->View tab, then check Show Hidden
Objects and Show System Objects.

Now go back to the table view. You will see a bunch of additional tables
(by the way, this also means you can hide some of your own tables!). Take a
look at some of those tables and poke around in them (you will be amazed at
the junk in there!). "MSysQueries", for example, has all kinds of query
info.
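
If you prefer code to the Options dialog, a rough sketch along these lines will list the hidden system tables I am talking about (MSysObjects and its Type codes are undocumented Jet internals, so treat this strictly as read-only poking):

' Read-only poke at Jet's hidden system tables in the current database.
' Type = 1 marks local tables; names starting with "MSys" are system objects.
Sub ListSystemTables()
    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT Name FROM MSysObjects " & _
        "WHERE Type = 1 AND Name LIKE 'MSys*' ORDER BY Name")
    Do While Not rs.EOF
        Debug.Print rs!Name
        rs.MoveNext
    Loop
    rs.Close
End Sub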

So, when you get a bunch of people hitting the same tables, you simply
increase the chances of damage. This applies to both the front end and the
back end. However, it is rather simple to figure out that less network
traffic = fewer problems.

I would not be too surprised if a split arrangement that still shares the
front end on the server does increase reliability somewhat, because it
reduces the concurrency issues. But giving each user their own front end
further reduces any type of concurrency problem and ALSO reduces the network load.

Most of the reasoning here is based on common sense, and that common sense is
backed up by what people have experienced. It is commonplace to see
corruption and problems go away after splitting and moving the front end to
each user.

And I have seen people run shared NON-split mdb files for years without
trouble, too! So some do get away with doing this, but it is not a good idea.
Your mileage will vary!

You can't find a well-written answer or a cut-and-dried article for
everything you ask. For example, if you fill a glass with Pepsi and then let
it go, that glass will likely fall to the ground and likely break. Now, let's
say you fill a glass with Coca-Cola. Are you going to look on the web for
another article that says a glass full of Coca-Cola will fall to the ground?
No. As a professional, you learn the fundamentals. In the case of the glass,
you learn about gravity, and perhaps some math formulas that tell you about
acceleration.

The IT industry is not much different than the medical or engineering
fields. If we could have a nice book with all the answers, there would
simply be no need for consultants, or doctors for that matter. The fact is,
we gather a base of knowledge and use that knowledge to answer questions.
So, when someone comes along and asks what happens if you fill a glass with
cream soda and let it go, that is a new question! If you have a good
knowledge base, you can give a useful answer and tell that person the glass
will fall when let go. That answer is based on your reasoning, not on some
article telling you so. I would be most surprised if I could find an article
on the web stating that a glass full of orange soda will fall to the ground
when let go, but we can most certainly conclude this based on our knowledge.

I could be rather unfair here and twist this whole issue around, and ask you
to produce an article that suggests, or shows, how to run MS Word from the
server. The fact is, no one would suggest trying to run and load the Word
code base from the server. No one is suggesting trying to load and run Excel
this way either. So I doubt we will find articles that tell you not to do
this! However, based on your experience and exposure to computers and our
industry, do you have any commercial Windows software that loads and runs
its code base *across* the network? You *likely* don't. So why is it that
you install every other piece of software on the PC, but for YOUR software
you don't want to follow what YOU have been doing all these years? I mean,
based on what the industry does, I don't see why you need to ask:

Why are copies of MDE FE's considered best Access 2002 practice?

In fact, it is standard fare in the industry to install and run the software
on EACH computer. So you are kind of asking for an example of the cream soda
glass! Those types of clear-cut answers are not going to be given to you.
Life simply does not come with a book of all the cut-and-dried answers. If it
did, we would not need educated people anymore!

So, for example, why not have an application that changes the menu bars
based on security?

Answer:
Because you can't find any other Windows software that does this!
Further, making manuals (and screen shots), training the users, trying to
maintain the code, etc., is not workable. This list of menu problems could
go on for a long time! However, since we rarely see Windows applications
change the menu bars on users, do we really need a book or an article
telling you not to do this? Again, based on your experience in the industry
and your exposure to the software development process and developers, you
will simply come to a reasoned conclusion on this issue (and likely not do it).

It would seem to me that reducing network traffic should increase the
reliability of the system. Again, do we really need an article or an example
showing that reducing network traffic will increase reliability? (This does
not just apply to ms-access!) Further, it is VERY important to realize we
are using a file share. A file share = risk! (Much risk!)

I cannot stress enough that using a file share is a risk!

Go to a FoxPro users group and ask who has not experienced a corrupted index
in a multi-user file share environment. Heck, fire up a web search engine
and type in

FoxPro corrupted index

You will get a zillion hits. Note that we are now NOT talking about
ms-access. However, in both cases we are talking about running a file share
across a network. The whole idea of using a file share involves MUCH MORE
risk than using a client-to-server arrangement. (This should not be
a surprise.)

And, to be honest, I can't remember the previous threads, but you are likely
using SQL Server here anyway? At least your data can't be damaged when you
do this (and the Office CD does include a free server-based engine that you
can use in place of JET).

In fact, some companies that want to reduce problems simply will NOT allow
people to run their software in file share mode. Right now even Microsoft is
pushing this concept and wants people to use a true client-to-server setup
in place of JET (and I don't blame MS; why not create a system that reduces
the risk of problems? If people will not learn to drive standard-transmission
cars, then we have to give them all automatics. To MS's credit, at least
they do ship a free client-to-server engine with ms-access. Using this
engine will eliminate ALL problems of corruption of data).

Like any industry's, the body of knowledge is both craft and learning. You
can't learn all this stuff in a day. If you could, the consulting industry
(or any professional industry) would not exist.

The best you can do is build up a knowledge base of fundamental concepts,
and then come up with an answer to your questions based on that knowledge
base.

This idea of ready-made answers doesn't work in engineering, medicine, or
our IT industry for that matter.

So, based on what the industry generally does, and based on the nature of a
file share, using a split mde makes sense as a *good* practice.

It is not a MUST do...but it is a really good idea, and it will
increase the reliability of your system.

Good luck.

For sure, the fact that you like to press on these issues
shows very much that you do in fact want answers, and
you obviously like getting to the bottom of things!
This is good...not bad!
 

Van T. Dinh

The few times I have seen corruption due to a shared Front-End, the
corruption tended to be "Unrecognised database format". However, IIRC, there
are other types of corruption as well. OTOH, I have seen an unsplit A97
database used by 5+ users without any problems for over 2 years.

I am not sure about your question regarding validation. If you are talking
about data validation, it doesn't matter whether it is a shared Front-End or
not: data still has to be validated. If you mean keeping all copies of the
Front-End updated to the same (latest) version, Tony Toews has already
posted the link to his Auto FE Updater, and you can use it to keep all
copies of the Front-End up to date.

--
HTH
Van T. Dinh
MVP (Access)



 

Tony Toews

Stephen J. Levine MD said:
What is the nature of the corruption that you are seeing?
I have been meaning to ask you that.

I don't recall exactly what it was. I was in too much of a hurry to
replace it and reconfigure things, all over the phone, to worry about
the exact details.
Also, what has your experience been with multiple access
sessions with the same FE?

None, as I've never done that.

Tony
--
Tony Toews, Microsoft Access MVP
Please respond only in the newsgroups so that others can
read the entire thread of messages.
Microsoft Access Links, Hints, Tips & Accounting Systems at
http://www.granite.ab.ca/accsmstr.htm
 

Tony Toews

Arvin Meyer said:
Very simply, sending a front-end over the network increases the number of
packets sent and received. The more packets moved, the more data collisions,
and the greater the chance of a malformed packet.

I'm going to disagree with this conclusion when it comes to FE
corruption. I suspect it has more to do with Access temporarily
updating filters and other internal stuff in forms and reports
while the file is being shared.
If a front-end file gets
corrupted, or bloated, or anything happens to it, all one needs do is
download a new copy from the server.

Agreed to this.
Most corruption occurs in OLE fields (Memo, hyperlink, bound and unbound OLE
fields) because the data is not stored on the same page. Pointers are used
to write a disk address to the table. When any other index is corrupted,
it's relatively easy to rewrite. (That's part of what compacting does). Not
so with pointers. They are only easily rewritten if they were correct in the
first place. If the disk address cannot be found, or it doesn't match the
record in the pointer, the row or page is marked as corrupted.

Agreed to this as well. But this applies to tables, not FEs, as Stephen
points out.
I now ignore any requirement to run a front-end of any kind
(other than ASP) over a server.

And ASP, of course, runs on the server presenting just finished HTML
pages to the client.

Tony
--
Tony Toews, Microsoft Access MVP
Please respond only in the newsgroups so that others can
read the entire thread of messages.
Microsoft Access Links, Hints, Tips & Accounting Systems at
http://www.granite.ab.ca/accsmstr.htm
 

Tony Toews

Another issue that one could raise about individual copies
of the front end on PC's would be the issue of, in a
pharmaceutical setting, the need for each to be
validated. I wonder if anyone from the pharmaceutical
industry is on this Newsgroup and would care to comment.

Define validation.

Why is the pharmaceutical industry different than the health care
industry in the U.S. with this new HIPAA law I've read about?

Is code signing, now available in A2003, sufficient?

Tony
--
Tony Toews, Microsoft Access MVP
Please respond only in the newsgroups so that others can
read the entire thread of messages.
Microsoft Access Links, Hints, Tips & Accounting Systems at
http://www.granite.ab.ca/accsmstr.htm
 

Tony Toews

Van T. Dinh said:
OTOH, I have seen an unsplit A97
database used by 5+ users without any problems for over 2 years.

From what I've seen A97 is much more robust about sharing unsplit or
FE MDB/MDEs than A2000 is.

Tony
--
Tony Toews, Microsoft Access MVP
Please respond only in the newsgroups so that others can
read the entire thread of messages.
Microsoft Access Links, Hints, Tips & Accounting Systems at
http://www.granite.ab.ca/accsmstr.htm
 

John Vinson

I am not sure about your question regarding validation.

Having had to deal with "application validation" in my 17 years in the
pharmaceutical industry, I suspect it has nothing to do with table
validation rules, but rather with software validation - meeting the
legal requirements that any computer programs dealing with clinical
data must be "validated" to FDA standards.

Since (in my experience) these standards were written on the
assumption that all computer programs are custom-written in Assembler
for the IBM/360 and can be validated instruction by instruction... or
at least the regulations give this distinct impression... quite a lot
of interpretation is needed! (Well, yes, I'm exaggerating, but that's
one reason I'm glad I'm no longer working for a big pharmaceutical
company).
 

Larry Linson

Van

Do you know what the nature of the corruption in the front
end was?

Generally, it is not possible to determine "the nature of the corruption" --
some, or all, of the database objects become unusable or unreadable, Access
tells you you need to compact and repair, and compacting and repairing may
or may not solve the corruption. If it does not, you can recover from a
backup copy, if you have one, or send it off to Peter Miller at
http://www.pksolutions.com, who has an excellent reputation for recovering
data from damaged Access databases.

As I am sure you know, Microsoft has not published the details of the
internal structure of MDB files, and the source code is not available
outside Microsoft. Thus, many of the details of corruption that you ask for
are simply not available.
Another issue that one could raise
about individual copies of the front end
on PC's would be the issue of, in a
pharmaceutical setting, the need for
each to be validated. I wonder if anyone
from the pharmaceutical industry is on this
Newsgroup and would care to comment.

I'm sorry, but it is truly a waste of time and effort to grasp at straws
trying to justify having multiple users logged in to the same front end.
Every user should be using, and can be forced to use, a copy of the same
front-end database application. If industry standards require that front-end
be "revalidated" in the environment in which it is going to execute, then
that is the price of using this tool in that industry.

As has been pointed out, a web-based database application, using .asp pages,
or ASP.NET, can be used in place of an Access multiuser database
application. It will definitely take more time and effort to develop and
test. Its thin-client (browser) user interface will have less
capability/flexibility than the rich-client Access multiuser, but it will
let each user execute the same code, though perhaps it might be necessary to
validate on the user's PC because each might be using a slightly different
level of Service Packs on the browser, or, indeed, different browsers
altogether.

Larry Linson
Microsoft Access MVP
 

Arvin Meyer

I've only seen 3 corrupted front-ends in the 11 years I've been developing.
One of them was an Access 95 database I was developing, and the other 2 were
a colleague's Access 2000 databases which he was developing on the server.
All 3 corruptions occurred during development. The only corrupted production
front-end database I've ever seen was an unsplit one running from the
server. As a result, I cannot consider myself an expert in front-end
corruption. I've seen about 12 to 15 back-end databases corrupted. All but 1
of those had a front-end running from the server, not the workstation. The
numbers would indicate that running a front-end on the server caused the
overwhelming number of corruptions that I've seen. Also, in only a few of
the production cases (maybe 4) was I unable to determine that either a human
or a piece of hardware was directly responsible. The cause in those 4 is
undetermined.

Sooo..... out of 15 to 18 total corruptions over an 11 year period, a third
of which were in 1 year and directly associated with a bad wireless card,
only 4 of them *could* have been caused by shared instances of filters. The
others most definitely were users or hardware.

The corruption occurs in the backend, but is caused by an event normally
coming from the front-end. Why? Your guess may be better than mine.
--
Arvin Meyer, MCP, MVP
Microsoft Access
Free Access downloads:
http://www.datastrat.com
http://www.mvps.org/access
 

Stephen J. Levine MD

Define validation.
Why is the pharmaceutical industry different than the health care
industry in the U.S. with this new HIPAA law I've read
about?

HIPAA basically deals with ensuring confidentiality of
information.

The pharmaceutical industry and its regulatory bodies are
more concerned with process control, which is the ability
to guarantee reproducibility of processes and, through
this reproducibility, to ensure potency and purity (I am
using the lingo of the trade, but if you take your time
with it, I think you will understand).

Validation of a process means evaluating the ability of
the process to produce consistent and desirable results.
For example, validation of a procedure to produce, let's
say, the antihistamine chlorpheniramine, would consist of
testing the procedure enough times and under enough
different conditions as necessary to ensure that the end
product is indeed chlorpheniramine, it is of the desired
concentration = strength = potency and that it is of the
desired purity with no more than acceptable levels of
undesired contaminants.

In the area of pharmaceutical and medical device
software, validation consists of testing the software
under a wide variety and severity of conditions to see if
it performs acceptably, yielding the correct outcomes.
Very important to this is the testing of a wide variety
of scenarios. It is recognized that software failures
can be unexpected, i.e. occur without warning, and be
potentially severe. Of interest is that one of the
incidents that precipitated the move to regulate software
was a program operating a device that irradiated tumors
in patients. A failure of this software occurred when a
tech entered, by mistake, 10 times the desired dose of
irradiation and then corrected it to the correct dose.
Unfortunately, the correct dose never registered and the
patient received 10 times the correct dose of irradiation
and subsequently died.

If you are writing medical device or pharmaceutical
software for sale, you must apply for 510(k) device
approval. The FDA will examine the processes by which
you create software and what mechanisms you have in place
to prevent, where possible, software errors. Because
errors are unavoidable, they want to see that you have
procedures and processes in place for reporting
post-release errors, for notifying the user base of
serious problems, and for improving your processes when
an error occurs so that errors of that type are avoided
in the future (so-called CAPA: Corrective and Preventive
Action).

Of course, as has been stated in the pharmaceutical
literature for the past 20 years, quality cannot be
tested into software; it must be designed in. As such,
when you design and create medical and pharmaceutical
software, you want to do it in a manner that lowers the
risk of error. My own and others' advice is to keep the
design simple and modular, using object-oriented
principles to control dependencies between components,
and to avoid logic constructs that are at increased risk
of coding errors, such as "if not a and if not b" (which
can easily be miscoded as "if not a or if not b"; see the
short illustration below). The way you design your
database is important, and you should try to normalize it
as much as possible (a normalized database is a happy
database <g>). I recommend that you stick to the raw data
instead of triggering derived data, and use software
logic to provide a virtual interpretation of that data as
you need it (I liken it to keeping the moving parts to a
minimum). In medical device and pharmaceutical software,
you need to get it right the first time.
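
As a tiny illustration of that Boolean pitfall (the flag names and routine are made up):

' Hypothetical flags, for illustration only.
Sub DeMorganExample(ByVal blnCancelled As Boolean, ByVal blnOnHold As Boolean)
    ' Intended rule: proceed only when neither flag is set.
    If Not blnCancelled And Not blnOnHold Then
        Debug.Print "Correct test: neither flag is set, safe to proceed."
    End If

    ' Easily miscoded version: true whenever at least one flag is clear,
    ' which is almost never what the specification meant.
    If Not blnCancelled Or Not blnOnHold Then
        Debug.Print "Miscoded test: fires even when one of the flags IS set."
    End If
End Sub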

And now, as a friend of mine says, you know as much as I
know. I kind of did this ad hoc so I hope it makes sense.

sjl
 

Stephen J. Levine MD

"... Its thin-client (browser) user interface will have
less capability/flexibility than the rich-client Access
multiuser, but it will let each user execute the same
code, though perhaps it might be necessary to validate on
the user's PC because each might be using a slightly
different level of Service Packs on the browser, or,
indeed, different browsers altogether."

Larry

You raise a very interesting point here, and one that I
agree with. I came from a period of the mainframe where
the user could only execute one program at a time and was
restricted to the interface that you programmed for him.
In a pharmaceutical setting, this was great because you
only gave the user the options you wanted him to have.

In a production environment, I feel the PC offers too
many options. I don't know how many times I have seen
users play with colors and fonts and we have had
instances where font changes have changed the way printed
forms have come out to where they are no longer
acceptable because columns do not match up, lines wrap or
are truncated unacceptably or verbiage spills onto an
extra page.

This is one of the reasons why, in our industry, we are
moving toward web-interface applications. And because we
are a single organization, we can standardize on a single
web browser, which for our institution is IE6.

Now, the department for which I wrote my application is
not in operations, although eventually we will set it up
so that entry of some of the data is done by operating
personnel. However, those working with and analyzing the
data will be higher-level personnel whose job is to
troubleshoot and recommend corrective actions.

Thanks for sharing your views. I am learning a lot from
you.

sjl
 

Guest

John

You hit the nail on the head. Validation is of software
functionality and reliability, particularly where the
software is involved in process control.

I would really like to learn more about what you know of
validation in the pharmaceutical industry if you would be
willing to share offline. I have a lot of questions.

sjl
 
