dotnet windows forms vs. Access

Thomas

Rob, great. Winter is quite suitable if nobody else has time before then.


Regards,
Thomas
 
Tony Toews [MVP]

Thomas said:
I have a proposal for trying to resolve this dilemma: if you have one or
two days free in the next month or two, we could sketch out a simple
fictitious business application (it could even be a real problem whose
solution we could then sell as a product, too :)).

Or how about doing it for a non-profit group that has a genuine need for such a solution?

Tony

--
Tony Toews, Microsoft Access MVP
Please respond only in the newsgroups so that others can
read the entire thread of messages.
Microsoft Access Links, Hints, Tips & Accounting Systems at
http://www.granite.ab.ca/accsmstr.htm
Tony's Microsoft Access Blog - http://msmvps.com/blogs/access/
 
Jim Rand

A simple question has been posed - does it make sense to move from MS Access
to .NET?

Back in 1999, I rewrote an application for the third time. The first
version was in dBase II and the second version was in Progress. The plan
for version 3 was VB on the frontend and Sybase SQL Anywhere on the backend.
With 10% of the project completed, I had burned through 25% of the money -
$10,000 at that point. (Math quiz: what was the budget and what was the
projected cost overrun?)

Two questions had to be answered: 1) how did I screw up the budget so
terribly, and 2) how could I fix it?

The answer to (1) is that I budgeted the project based on what it would
have cost me to do in MS Access. The reason I didn't use Access is that
for multi-user applications using bound forms and linked tables across the
network, Access stinks: it is slow and unreliable.

So what to do, what to do? The obvious answer was to fix the data
marshalling in MS Access so that it is fast and reliable.
Did that. My Access applications now run 125 times faster on the network
with bound forms to linked tables. Starting over with Access on the
frontend and Sybase on the backend, the total project cost (including the
initial $10,000) came in at $44,000. Speed over the network is
instantaneous, and it has never broken.

So why the shift to .NET? 1) My Access middleware DLL is written in VB and
uses RDO, both of which are nearing end of life. 2) My middleware only
works with Sybase, though with some work it could be made to work with SQL
Server. 3) Access looks old. 4) Microsoft is pushing .NET.

The project I'm now finishing up uses .NET, C# and SQL Server. The budget
was set at 1.3 times that of a comparable Access project. It appears that
the final cost will come in on budget.

The pluses: The database is in New Jersey with the frontend up here in
Maine. Performance is unbelievably fast - you wouldn't know the database
was off site over a wide area network. It doesn't break. It looks great.
Plus, the project is coming in on budget.

The minuses:
- Long learning curve.
- The out-of-the-box controls don't cut it - I use the Developer Express
controls (fantastic).
- The data designers are pretty much useless for the SQL logic - I wrote my
own, which work.
- There are some nasty gotchas in ADO.NET requiring workarounds.
- Designing forms in VS 2005 is just plain painful - dropping, renaming and
binding controls is way too slow.
- Microsoft is tardy in getting service packs out to fix framework bugs.

So back to the initial question, "does it make sense to move from MS Access
to .NET?" Even with some of the crap in Visual Studio, the outcome is
positive and well worth the investment.

My clients love the results, which means business will be good for some
time. New product development to replace aging Access is proving to be a
smart move.
 
Sylvain Lafontaine

This only tells us that you started a project in VB without knowing VB. I
have never seen someone start a project with a technology that he (or she)
doesn't know without running into massive cost overruns.

As to the data marshalling that you have done, I would be very surprised if
you couldn't do the same thing with either VB or .NET. In fact, I would be
even more surprised if you couldn't do even better with .NET than you did
with Access.

.NET has been designed to give you the fastest response possible with every
kind of backend (JET, SQL Server, Oracle, heterogeneous, etc.) over any
kind of network (local, LAN, WAN). If you run a test that gives a slower
response with .NET than with anything else, it's not because .NET is
slower; it's because you have made one or more errors somewhere.

Don't forget that with practically any new technology, you will usually
need at least a year of experience to gain some mastery of it.
 
Jim Rand

Sylvain Lafontaine said:
This only tells us that you started a project in VB without knowing VB. I
have never seen someone start a project with a technology that he (or she)
doesn't know without running into massive cost overruns.

The VB6 databinding killed me. .NET databinding is so much better, which
explains why the project costs for Access and .NET are in the same
ballpark. My experience parallels others': VB6 takes about 3 to 5 times
longer than Access, while .NET is coming in at 1.3 times, which, given the
other benefits, is very acceptable.
As to the data marshalling that you have done, I would be very surprised
if you couldn't do the same thing with either VB or .NET. In fact, I would
be even more surprised if you couldn't do even better with .NET than you
did with Access.

ADO.NET's disconnected DataSet, with its Fill and Update methods, mirrors
my 1999 Access/VB RDO architecture. That's one of the reasons I'm so
comfortable working with ADO.NET.
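
For anyone who hasn't seen the pattern, here is a minimal sketch of the
Fill/Update round trip in C#. The connection string, table and column
names are hypothetical, not my actual schema, and SqlCommandBuilder stands
in for the hand-rolled middleware:

using System.Data;
using System.Data.SqlClient;

public static class InvoiceData
{
    public static DataSet LoadInvoices(string connStr, int year)
    {
        SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT InvoiceID, CustomerID, InvoiceDate FROM Invoice " +
            "WHERE InvoiceYear = @Year", connStr);
        adapter.SelectCommand.Parameters.AddWithValue("@Year", year);

        DataSet ds = new DataSet();
        adapter.Fill(ds, "Invoice"); // one round trip; the connection opens and closes here
        return ds;                   // from here on you work offline: bind it, edit it, etc.
    }

    public static void SaveInvoices(string connStr, DataSet ds)
    {
        SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT InvoiceID, CustomerID, InvoiceDate FROM Invoice", connStr);
        SqlCommandBuilder builder = new SqlCommandBuilder(adapter); // derives INSERT/UPDATE/DELETE
        adapter.Update(ds, "Invoice"); // one more round trip: only the changed rows go back
    }
}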
.NET has been designed to give you the fastest response possible with
every kind of backend (JET, SQL Server, Oracle, heterogeneous, etc.) over
any kind of network (local, LAN, WAN). If you run a test that gives a
slower response with .NET than with anything else, it's not because .NET
is slower; it's because you have made one or more errors somewhere.

Couldn't agree with you more. With multi-threading, you can give the
illusion of even faster speed.
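
To make that concrete, here's one way to create that illusion: push the
writeback onto a worker thread so the UI never waits on the network.
BackgroundWorker is the VS 2005-era idiom; SaveInvoices is the
hypothetical save routine from the sketch above, not anything from my
actual framework.

using System.ComponentModel;
using System.Data;
using System.Windows.Forms;

public partial class InvoiceForm : Form
{
    private void SaveInBackground(DataSet ds)
    {
        // Hand the worker only the changed rows, so the UI's DataSet
        // isn't touched while the write is in flight.
        DataSet changes = ds.GetChanges();
        if (changes == null) return;

        BackgroundWorker worker = new BackgroundWorker();
        worker.DoWork += delegate(object sender, DoWorkEventArgs e)
        {
            // Runs off the UI thread.
            InvoiceData.SaveInvoices("...connection string...", (DataSet)e.Argument);
        };
        worker.RunWorkerCompleted += delegate(object sender, RunWorkerCompletedEventArgs e)
        {
            if (e.Error != null)
                MessageBox.Show(this, e.Error.Message); // surface failures back on the UI thread
        };
        worker.RunWorkerAsync(changes); // returns immediately; the user keeps typing
    }
}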
Don't forget that with practically any new technology, you will usually
need at least a year of experience to gain some mastery of it.

Plan on at least a year. But as I say, replacing aging Access with .NET
has turned out to be a smart business move.
 
(PeteCresswell)

Per Robert Morley:
The performance of VB.NET also has yet to equal that of VB6/VBA, so that is
a consideration as well. The only way to get acceptable performance

But isn't perceived performance mainly how fast the app can get
stuff in and out of the back end? Given that, wouldn't they all
be about the same?
 
(PeteCresswell)

Per Jim Rand:
So what to do, what to do? The obvious answer was to fix the data
marshalling in MS Access so that it is fast and reliable.
Did that. My Access applications now run 125 times faster on the network
with bound forms to linked tables.

Can you list some of the things you did to make that improvement
in marshalling?
 
Baz

So you budgeted to do a project in Access, then tried to do it in VB for
the same budget (duh!), and because you couldn't, this is somehow Access's
fault?

Access using bound forms and linked tables for multi-user apps is so slow
and unreliable that I am always getting panic-stricken support calls from my
many customers. Some of them I hear from every year! Mind you, I do use
SQL Server, not Sybase.

I think your post tells us a lot more about you than it does about any
technologies.

Jim Rand said:
A simple question has been posed - does it make sense to move from MS Access
to .NET?

Back in 1999, I rewrote an application for the third time. The first
version was in dBase II and the second version was in Progress. The plan
for version 3 was VB on the frontend and Sybase SQL Anywhere on the backend.
With 10% of the project completed, I had burned through 25% of the money -
$10,000 at that point. (Math quiz: what was the budget and what was the
projected cost overrun?)

Two questions had to be answered. 1) How did I screw up the budget so
terribly and 2) how could I fix it?

The answer to (1) is that I budgeted the project based on what it would
have cost me to do in MS Access. The reason I didn't use Access is that
for multi-user applications using bound forms and linked tables across the
network, Access stinks: it is slow and unreliable.
<snip>
 
Baz

Where do I start? How about the bloated, festering dotnet framework,
redolent as it is with the promise of versioning nightmares in the years
ahead? How about not being able to write unmanaged code in anything but
C++? How about it being so damned slow? How about the IDE being so damned
awful? How about MS's sheer, arrogant stupidity in trying to kill the
world's most popular development tool (I refer of course to VB6) without
even providing backward compatibility?

That's a few gripes about the big stuff; I could think of a lot more if I
tried. There's LOADS more at the small level. My favourite in 2002/2003
was the godawful combo box, which couldn't even do what a VB6 combo could
do, let alone an Access combo. Sure, you can build your own controls,
which is exactly what I did to get a half-decent combo box (or you can buy
'em if you didn't already think you'd wasted enough dosh on this garbage),
but that's what you have to do all the time: so much of it doesn't quite
work as you would expect or want that you spend all your damn time
finding/creating workarounds and alternatives.

Yeeeuk! I don't blame you for puffing this stuff - you've got a living to
make - but frankly it makes me ill.
 
Jim Rand

These are the relative times for getting at a backend database, where the
ODBC API is 1.0 and higher numbers are slower (source: Microsoft):

1.0        ODBC API
1.1        Query pass-through (read-only way to get the data)
1.1        RDO (Remote Data Objects) - Microsoft's only comment was "it is
           very fast"
125 - 150  Access dynasets via linked tables

So the answer is to get the data from the backend database via query
pass-throughs and to write changes back with RDO.

Implementation:

I have a program mdb on the client, which creates a temporary data mdb on
the client. The middleware is written as a VB DLL for speed (compiled) and
for the use of RDO. Prior to opening a form, the middleware:
- Creates tables in the data mdb, using model tables saved in the program
mdb as templates
- Links to the data tables in temp.mdb (a local link - not over the network)
- Creates a query pass-through for each table
- Creates append queries that use the query pass-throughs as their source
to load the local linked tables
- Fires the append queries (got the data now - Access is totally faked
out, both at design time and at runtime)

The SQL for each query pass-through is stored in the backend database.
Here is a sample:

SELECT B.BudgetID, B.ChartAcntID, B.BudgetYear,
       B.January, B.February, B.March, B.April, B.May, B.June,
       B.July, B.August, B.September, B.October, B.November, B.December,
       CAST(B.TS AS CHAR) AS ExactTS
FROM Tiger.Budget AS B
WHERE B.BudgetYear = ::01
ORDER BY B.ChartAcntID

BudgetID is the autoincrement primary key issued by the database. B.TS is
a timestamp, cast as char for concurrency checking. (Note: in .NET with
SQL Server, you would cast B.TS as an int, since SQL Server's timestamp is
really a binary row version.) ::01 is a placeholder for a parameter token.
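
To show where that ExactTS value earns its keep, here is a sketch of the
writeback side in C# against SQL Server (my original was Sybase via RDO;
the table and columns come from the sample above, the rest is
hypothetical, with the row version kept as a byte array rather than a char
cast). The UPDATE succeeds only if the row version still matches what was
originally read, so a competing edit shows up as zero rows affected:

using System.Data.SqlClient;

public static bool UpdateBudget(string connStr, int budgetID,
                                decimal january, byte[] originalTS)
{
    using (SqlConnection conn = new SqlConnection(connStr))
    using (SqlCommand cmd = new SqlCommand(
        "UPDATE Tiger.Budget SET January = @January " +
        "WHERE BudgetID = @BudgetID AND TS = @OriginalTS", conn))
    {
        cmd.Parameters.AddWithValue("@January", january);
        cmd.Parameters.AddWithValue("@BudgetID", budgetID);
        cmd.Parameters.AddWithValue("@OriginalTS", originalTS);

        conn.Open();
        return cmd.ExecuteNonQuery() == 1; // false: someone else got there first
    }
}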

For children of parent tables, you refire the append queries in the
Form_Current event - i.e., you only hold the invoice detail locally for
the one invoice you are looking at.

*** In the parent form ***
' Retrieve the line items for this order from Sybase and requery the subform
Set f = Me.zInvoiceLineitems.Form
f.RequeryData lInvoiceID:=Nz(Me.InvoiceID, 0)
Set f = Nothing

*** In the child form ***
' Refresh the local InvoiceLineItem table for this invoice, then requery
ReDim vNewTokenValues(1)
vNewTokenValues(1) = lInvoiceID
goTblsSybase.Item("InvoiceLineItem").RefreshData _
    vNewTokenValues:=vNewTokenValues
Me.Requery


Writing the data back is a little trickier. To do this, I have a
CFormDataHandler class which each form instantiates. Public methods include
AddTimerItem, Form_AfterDelConfirm, Form_AfterInsert, Form_AfterUpdate,
Form_Current, Form_Delete, Form_Timer, Form_Unload, Initialize and
RefreshData. Each form and each subform delegates activity back to its
oFormDataHandler object. The middleware figures out what the insert,
delete, and update statements are from the initial select statement - sort
of like .NET's command builder, but smarter.
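
For readers who haven't met that .NET counterpart: given a single-table
SELECT, SqlCommandBuilder derives the INSERT, UPDATE and DELETE commands
for you, and you can inspect what it generated. A minimal sketch (the
SELECT reuses the Budget columns from above; the connection string is
hypothetical):

using System;
using System.Data.SqlClient;

public static void ShowDerivedCommands(string connStr)
{
    SqlDataAdapter adapter = new SqlDataAdapter(
        "SELECT BudgetID, ChartAcntID, BudgetYear, January FROM Budget",
        connStr);
    SqlCommandBuilder builder = new SqlCommandBuilder(adapter);

    // The derivation my middleware does by hand:
    Console.WriteLine(builder.GetInsertCommand().CommandText);
    Console.WriteLine(builder.GetUpdateCommand().CommandText);
    Console.WriteLine(builder.GetDeleteCommand().CommandText);
}

One place a hand-rolled version can be "smarter" is concurrency: the stock
builder compares every column in its WHERE clause, whereas my middleware
compares just the primary key and the timestamp.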

The framework is complex, but once it is written, it's really simple to
use. Access is totally faked out: it becomes a really fast single-user
database working only with local data. Since you are not relying on linked
tables across the network, reliability goes to 100% - it never breaks.
Plus, all access to the backend database happens at ODBC API speed.

Since the initial project back in 1999, I've reused this framework on
numerous projects. For one, we had to connect via a really slow fractional
T-1. After making an entry in a cell in a grid, the client pressed the
down-arrow key, which triggered the writeback to the database via the
framework. The time to get to the next row was instantaneous. The client
said, "Wow, it's like the server was right there."

This is the magic of disconnected data. Think about it: that is what
ADO.NET is all about with its disconnected DataSets.

They say "real programmers don't use bound forms". In my experience, bound
forms offer huge productivity gains. Access has them, as does .NET.

Jim
 
(PeteCresswell)

Per Jim Rand:
The framework is complex, but once it is written, it's really simple to
use. Access is totally faked out: it becomes a really fast single-user
database working only with local data.

You lost me on how you get the data from the back end into your
temp tables. Is the back end SQL server?

Other than that part, it sounds like my SOP for running an app:
copy everything to temp tables on C: and then use bound forms
against the temp tables.

Seems to me like it's a happy compromise: the convenience and
services of bound objects, but close to the speed/lack of
contention/safety of unbound objects.

For a .MDB back end, I just use Append queries to write to temp
tables. For SQL Server back ends, I'll write a stored procedure
that only hits the server once, but comes back with a data stream
for each temp table I need to populate.
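
The same one-hit trick, sketched in ADO.NET terms for comparison (the
stored procedure, parameter and table names are all hypothetical): one
procedure returns several result sets, and each lands in its own local
DataTable.

using System.Data;
using System.Data.SqlClient;

public static DataSet LoadFormData(string connStr, int invoiceID)
{
    SqlDataAdapter adapter = new SqlDataAdapter("dbo.GetInvoiceFormData", connStr);
    adapter.SelectCommand.CommandType = CommandType.StoredProcedure;
    adapter.SelectCommand.Parameters.AddWithValue("@InvoiceID", invoiceID);

    // The result sets arrive unnamed (Table, Table1, ...); map them to real names.
    adapter.TableMappings.Add("Table", "Invoice");
    adapter.TableMappings.Add("Table1", "InvoiceLineItem");
    adapter.TableMappings.Add("Table2", "CustomerList");

    DataSet ds = new DataSet();
    adapter.Fill(ds); // one server hit, three local tables filled
    return ds;
}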

I experimented with caching drop down rowsources once with a .MDB
back end, but I'm not sure that it helped performance any. Last
SQL Server back end I did, caching the dropdowns helped
noticeably at form-open time.
 
Guest

yes, you are wasting your time

Microsoft is going to make a new .NET version of ADP with the next release--
so .NET is not a total waste of time

but it is obvious that ADP is a much much much better platform than .NET
 
Sylvain Lafontaine

Where should you start? Well, you should start with VB6, VC6, COM/DCOM/MTS,
ActiveX and the registry itself!

In case you forgot, this whole system was already in the process of
imploding under its own weight ten years ago. Writing a program today is
no longer a matter of aligning a few text boxes, combo boxes and list
boxes on a simple form, tabbed form or continuous form on an isolated
machine, without internet access and maybe even without a LAN.

Take a look at all the criticisms that were made of VB6, unmanaged C++,
DLL hell, the total lack of security and the constant corruption of the
registry - to name only those - and now think about what the situation
would be if, instead of dumping all that into the garbage bin, MS had
grown these dinosaurs by a factor of ten to one hundred. Increasing the
size of the registry by a factor of one hundred is not only a reasonable
assumption; it's quite likely a gross underestimation of the space that
would have been required to cover the actual capabilities of the .NET
framework using these old technologies.

I agree with you that even on a Core 2 Duo, the current Framework 2.0 runs
slowly, but if you had tried to do the same things with COM/DCOM/ActiveX -
as it was with VB6 - your machine would probably not just have run slowly
but imploded into a black hole.

Buying a dual core is now standard, and quad cores will soon replace them
as the basic developer machine. (The price of the quad core will probably
be cut in half this autumn, if not before.) Next year, you will start to
see machines with 8 cores and 16 to 32 gigs of memory running 64-bit Vista
as the basic machine bought by most developers. With that kind of power
available in 2008, do you really think people wanted to keep VB6/VC6 -
with a few more gewgaws here and there - as their main development tools?

VB6 was the most popular tool of recent years? So were dBase III, Lotus
1-2-3 and WordPerfect. Now all of those tools are gone, because
Ashton-Tate, Lotus and WordPerfect Corporation believed that once you have
reached a market share of 90% or more, you no longer have to evolve and
your base of loyal users will remain with you for eternity.
 
Robert Morley

I'm surprised you're so emphatic about this, Sylvain. That's not like you,
from what I've seen.

I have to go with Baz on this one. Many, MANY people are moving away from
the .NET framework (Delphi seems to be a popular choice) because VS
2002/2003 were very slow, and while VS 2005 offers some improvements, I
gather, it's still often sluggish when using managed code, and many VB6
developers have no interest in learning C++ as the only unmanaged
alternative. And, of course, there's the nightmare of having to
redistribute the megalith that is the Framework itself.

As for how fast it will eventually be on upcoming OS's/hardware, designing a
system for stuff that's not out yet isn't usually the best idea, as people
have to use this stuff today on their current OS and hardware, not a year or
two (or more) from now. I'm also a firm believer that the hardware
shouldn't have to compensate for the underlying speed of the application.

I completely disagree that VB6 with COM/DCOM/ActiveX would be a "black
hole". Most tests have shown that, in fact, it's significantly faster than
.NET in most cases. There are certainly some tests where .NET outperforms
VB6, and it's a lot more capable in some areas like multithreading, but
most of the articles I've read have put VB6 at 1.5 - 2x faster than the
equivalent .NET code.

But regardless of any of that, even if VS 2005 works faster in any given
scenario, Access is still VBA-based, and as such, you'd still be using COM,
etc. *If* you're going to use Access, which many people still prefer to
.NET solutions, it makes sense to supplement it with VB6/COM where
necessary, rather than using .NET and having to go through Interop.

I will agree that it looks like VB6 is likely to die eventually, and
unfortunately, Microsoft seems to have no understanding of why people are
complaining about that fact. An unmanaged version of VB with greater
backwards compatibility has been suggested, and requested by thousands
(http://classicvb.org/Petition/), but again, it seems unlikely.
Nevertheless, VB6 continues to be supported on Vista, at least in some
fashion, so I don't really see much of a problem continuing to use that for
the time being.

All that said, if .NET is fast enough for you (and as I say, I gather 2005
made improvements in that area), then certainly there's a lot to be said for
it in terms of enterprise development, Internet development, etc. So, as
they say, "if it works, don't knock it." For myself, however, the
sluggishness was intolerable, and at least so far, I'm sticking with VB6
DLLs to back up my Access projects where necessary.


Rob
 
(PeteCresswell)

Per Rick Brandt:
How do you deal with concurrency? Presumably a long time can pass between
retrieving data into the temp tables and all the changes being written
back to the source. How do you know some of those same records have not
been changed by other users?

When people talk about the benefits of "disconnected" data I never
understand how they can deal with this part.

I don't. The term of art that I've heard is "Last In Wins".

If I had to do something in that vein with a .MDB, I guess it
would just be a field in the record - something like a LAN UserID
and a timestamp.

But that's a whole new layer of complexity and programming effort
and the kind of apps I do aren't really exposed to concurrency
because of the small number of users.

So, I guess one cost of my approach is loss of JET's monitoring
of concurrency issues.
 
(PeteCresswell)

Per "Sylvain Lafontaine" <sylvain aei ca (fill the blanks, no
spam please)>:
Buying a dual core is now standard, and quad cores will soon replace them
as the basic developer machine. (The price of the quad core will probably
be cut in half this autumn, if not before.)

Tangential OT Question: Given that I'm running XP Pro and 32-bit
MS Office, would I notice any speed increase from going to a dual
or quad core machine?
 
David W. Fenton

If I had to do something in that vein with a .MDB, I guess it
would just be a field in the record - something like a LAN UserID
and a timestamp.

But that's a whole new layer of complexity and programming effort
and the kind of apps I do aren't really exposed to concurrency
because of the small number of users.

So, I guess one cost of my approach is loss of JET's monitoring
of concurrency issues.

Then what's the benefit of editing local temporary tables? With that
small a user population, forms bound to the linked tables ought to
function just fine.

What problem are you trying to solve with all the complexity you
introduce with your temp tables?
 
PMK

yes, you are wasting your time

Microsoft is going to make a new .NET version of ADP with the next release--
so .NET is not a total waste of time

but it is obvious that ADP is a much much much better platform than .NET

.NET version of an Access Data Project? I don't get it. Tell me more.

Peter
 
(PeteCresswell)

Per David W. Fenton:
Then what's the benefit of editing local temporary tables? With that
small a user population, forms bound to the linked tables ought to
function just fine.
What problem are you trying to solve with all the complexity you
introduce with your temp tables?

I don't see it as all that complex. More code, yes.... but far
from rocket science.

I'm not pushing it as the end-all-be-all, but it works for me.

What I *think* I get is:
------------------------------------------------------------
1) Maybe a little more robustness.

Nobody's "in" the back end for more than a fraction
of a second - so if somebody goes to lunch or home
leaving the app up and running on their PC, they're
not locking anybody else out who wants to edit
the same block.

If somebody's PC crashes or a NIC goes haywire
there's less chance of the back end getting
corrupted.

Remember - I said "Maybe"....


2) Control.

I like my forms to have two modes: Browse and Edit.
When big bucks are at stake and people are banging
on a form all day every day, I believe having to
click "Change" before being able to update any
fields allows fewer errors than if somebody can update
fields just by putting an elbow on the keyboard.

I know this can be achieved with a conventional bound
form - and I even tried a few like that. But every
time I tried it, something eventually came up that it
couldn't handle - I can't recall what, because it was
such a long time ago; but once I started gaming
the system, so to speak, the complexity and subtleties
needed seemed to me to be just as bad as the additional
code used by my methods.

Finally, when it's time for interdependent edit checking
(e.g. if BondType=Debt, then InterestAccrualType
must=Variable) I find my approach works for me.

Never did figure out how to do that with bound forms.


3) Transactions

When there are multiple child tables involved in a given
form/edit, I prefer to wrap everything in a transaction
so that if part of it goes south, nothing gets updated.
(See the sketch just after this list.)


4) Adaptability.

The guys I serve tend to evolve their applications as
circumstances arise. One of the reasons they called
on me and not IT was that they don't have the time or
inclination to spell everything out in a spec, sign it
in blood, and then take an hour or two out of their
work day to go before some committee every time they
want to make a change.

So if I go with the work-table-based form from the get-go,
and they come up with something like interdependent edit
checking, I know I can handle it without re-architecting
the whole form.
------------------------------------------------------------
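
Apropos of point 3, a sketch of the same everything-or-nothing writeback
in ADO.NET terms (the command objects are hypothetical; in my Access world
this would be a DAO workspace transaction instead):

using System.Data.SqlClient;

public static void SaveParentAndChildren(string connStr,
    SqlCommand parentCmd, SqlCommand[] childCmds)
{
    using (SqlConnection conn = new SqlConnection(connStr))
    {
        conn.Open();
        SqlTransaction tx = conn.BeginTransaction();
        try
        {
            parentCmd.Connection = conn;
            parentCmd.Transaction = tx;
            parentCmd.ExecuteNonQuery();

            foreach (SqlCommand cmd in childCmds)
            {
                cmd.Connection = conn;
                cmd.Transaction = tx;
                cmd.ExecuteNonQuery();
            }

            tx.Commit();   // everything sticks...
        }
        catch
        {
            tx.Rollback(); // ...or, if part of it goes south, nothing does
            throw;
        }
    }
}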


One more time: I'm not presenting this as
The-Good-Right-And-Holy-Path or The Revealed Truth or
anything like that.

It's just something that works well enough for me and using it
most of the time gives my apps an almost boring consistency -
i.e. I can go into one two years later and it's all there
right up front in the code. I don't have to try to remember what
little tricks I was performing to get something done.
 
Rick Brandt

(PeteCresswell) said:
Per David W. Fenton:


I don't see it as all that complex. More code, yes.... but far
from rocket science.

I'm not pushing it as the end-all-be-all, but it works for me.

What I *think* I get is:
------------------------------------------------------------
1) Maybe a little more robustness.

Nobody's "in" the back end for more than a fraction
of a second - so if somebody goes to lunch or home
leaving the app up and running on their PC, they're
not locking anybody else out who wants to edit
the same block.

No, but they could very well wipe out that person's changes, which is
exactly what record locking avoids. Locking records that you are editing
is a good thing. Making that lock affect as few records as possible is
even better. That is what bound forms do.
If somebody's PC crashes or a NIC goes haywire
there's less chance of the back end getting
corrupted.

Since that is only a problem if it happens during the actual commit to
disk, it seems to me that your method makes it even more likely, because
you are committing large blocks of changes at a time instead of numerous
(fast) smaller ones.
2) Control.

I like my forms to have two modes: Browse and Edit.
When big bucks are at stake and people are banging
on a form all day every day, I believe having to
click "Change" before being able to update any
fields allows fewer errors than if somebody can update
fields just by putting an elbow on the keyboard.

I know this can be achieved with a conventional bound
form - and I even tried a few like that. But every
time I tried it, something eventually came up that it
couldn't handle - I can't recall what, because it was
such a long time ago; but once I started gaming
the system, so to speak, the complexity and subtleties
needed seemed to me to be just as bad as the additional
code used by my methods.

I have lots of forms with that feature and I don't recall ever having any
problems with it.
Finally, when it's time for interdependent edit checking
(e.g. if BondType=Debt, then InterestAccrualType
must=Variable) I find my approach works for me.

Never did figure out how to do that with bound forms.

Seems like BeforeUpdate would handle that. Better yet would be rules at
the database level.
3) Transactions

When there are multiple child tables involved in a given
form/edit, I prefer to wrap everything in a transaction
so that if part of it goes south, nothing gets updated.

Okay, that's one I will concede :)
4) Adaptability.

The guys I serve tend to evolve their applications as
circumstances arise. One of the reasons they called
on me and not IT was that they don't have the time or
inclination to spell everything out in a spec, sign it
in blood, and then take an hour or two out of their
work day to go before some committee every time they
want to make a change.

So if I go with the work-table-based form from the get-go,
and they come up with something like interdependent edit
checking, I know I can handle it without re-architecting
the whole form.

Don't really see your point here.
One more time: I'm not presenting this as
The-Good-Right-And-Holy-Path or The Revealed Truth or
anything like that.

It's just something that works well enough for me and using it
most of the time gives my apps an almost boring consistency -
i.e. I can go into one two years later and it's all there
right up front in the code. I don't have to try to remember what
little tricks I was performing to get something done.

I can see it working, and even working well, but I can also see users
occasionally reporting "I did a bunch of changes yesterday and when I look
at the records today all of my changes are gone". In most database
applications the developer could simply roll their eyes and say "yeah,
sure". With your system it is an expected occurrence (however rare it
might be) and the only answer is "You'll have to enter them again".
 
