Legacy code, mORMot, and database sharing
You may well have to maintain and evolve a legacy project, based on an existing database, with a lot of already written SQL statements - see Legacy code and existing projects.
For instance, you may want to use mORMot for new features, and/or
add mobile or HTML clients - see Cross-Platform.
In this case, the ORM advanced features - like ORM Cache or BATCH process, see BATCH sequences for adding/updating/deleting records - may conflict with the legacy code, for the tables which may have to be shared.
Here are some guidelines when working on such a project.
To be exhaustive about your question, we need to consider each ORM CRUD operation.
We may divide them into three kinds: read queries, insertions, and modifications of existing data.
For the ORM Retrieve() methods, the ORM cache can be tuned per
table, and you will definitively lose some caching opportunities, but remember:
- That you can set a "time out" period for this cache, so that you may still benefit from it in most cases;
- That you have a cache at server level and another at client level, so you can tune it to be less aggressive on the client, for instance;
- That you can tune the ORM cache per ID, so some items which are not likely to change can still be cached.
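These cache settings can be sketched as follows, using the Cache property of any TSQLRest instance (a minimal sketch: TSQLCountry and TSQLInvoice are hypothetical TSQLRecord classes, and the ID values are made up for illustration):

```pascal
// tune the ORM cache on a TSQLRest instance (server- or client-side)
Client.Cache.SetCache(TSQLCountry);           // cache all TSQLCountry records
Client.Cache.SetTimeOut(TSQLCountry, 60000);  // but expire entries after 60 seconds

// cache only some particular items, not likely to change
Client.Cache.SetCache(TSQLInvoice, 10);       // cache TSQLInvoice with ID = 10
Client.Cache.SetCache(TSQLInvoice, [11, 12]); // and also IDs 11 and 12
```

Since the server and the client each have their own TSQLRestCache instance, you would typically call such methods on both sides, with a shorter time-out (or no caching at all) for the tables which the legacy code may modify behind the ORM's back.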
For the ORM BatchAdd() methods, when using
the external engine, if any external process is likely to INSERT new rows,
ensure you set the EngineAddUseSelectMaxID
property to TRUE, so that it will compute the next maximum ID by hand before each insertion.
But it still may be an issue, since the external process may do an INSERT during the ORM insertion.
So the best option is perhaps NOT to use the ORM
BatchAdd() methods for those shared tables, but to rely on dedicated INSERT SQL statements,
e.g. hosted in an interface-based service on the server side.
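Such a dedicated INSERT, hosted in a server-side interface-based service, could be sketched as follows (an illustration only: ILegacyOrders, TServiceLegacyOrders, the ORDERS table and its columns are all hypothetical, and fProps is assumed to be a TSQLDBConnectionProperties instance connected to the legacy database):

```pascal
type
  // hypothetical interface-based service contract
  ILegacyOrders = interface(IInvokable)
    ['{C92DCBEA-C680-40BD-8D9C-3E6F2ED9C9CF}']
    procedure AddOrder(CustomerID: Integer; const Amount: Currency);
  end;

  // server-side implementation class
  TServiceLegacyOrders = class(TInterfacedObject, ILegacyOrders)
  protected
    fProps: TSQLDBConnectionProperties; // connection to the legacy DB
  public
    procedure AddOrder(CustomerID: Integer; const Amount: Currency);
  end;

procedure TServiceLegacyOrders.AddOrder(CustomerID: Integer;
  const Amount: Currency);
begin
  // plain SQL with bound parameters, just as the legacy code would do,
  // so no ID computation is done on the ORM side
  fProps.ExecuteNoResult(
    'insert into ORDERS (CUSTOMERID, AMOUNT) values (?, ?)',
    [CustomerID, Amount]);
end;
```

This way, the insertion follows exactly the same path as the legacy statements, and there is no risk of ID clashes with a concurrent external INSERT.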
For the ORM Update() / Delete() / BatchUpdate() / BatchDelete()
methods, they sound safe to use in conjunction with an external process
modifying the DB, as soon as you use transactions to let the modifications be
atomic, so that they won't conflict with any concurrent modifications in the legacy code.
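Such a transactional modification could be sketched as follows (a minimal sketch: TSQLOrder is a hypothetical TSQLRecord class mapped to one of the shared tables, and Order an already retrieved instance of it):

```pascal
// let the modification be atomic, so that it won't interleave with
// concurrent legacy SQL statements running on the same table
if Client.TransactionBegin(TSQLOrder) then
try
  Order.Amount := Order.Amount + 100;
  Client.Update(Order);
  Client.Commit;
except
  Client.RollBack; // on any error, cancel the whole modification
end;
```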
Perhaps the safer pattern, when working with external tables which are to be
modified in the background by some legacy code, may be to use server-side
interface-based services - see Client-Server services via
interfaces - for any process involving external tables which may be
modified by another process, with manual SQL, instead of using the ORM "magic".
But it will depend on your business logic, and you will lose the benefit of
the framework's ORM features for those tables.
Nevertheless, introducing Service-Oriented Architecture (SOA) into your application would be very beneficial: ORM is not mandatory, especially if you are "fluent" in SQL queries, know how to make them as standard as possible, and have a lot of legacy code, perhaps with already tuned SQL statements.
Introducing SOA is mandatory to introduce new kinds of clients to your
applications, like mobile apps or modern AJAX sites: you can no longer access
the database directly, as you did with your legacy Delphi application
and its RAD DB components.
All new features, involving new tables to store new data, would still benefit from mORMot's ORM, and could still be hosted in the very same external database, shared with your existing code.
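For such new features, a new TSQLRecord class could be mapped to the very same external database, side by side with the legacy tables (a sketch under assumptions: TSQLAuditTrail, its fields and the AUDIT_TRAIL table name are hypothetical, and Props is assumed to be the TSQLDBConnectionProperties of the shared database):

```pascal
type
  // new table, handled by the ORM, but stored in the shared legacy database
  TSQLAuditTrail = class(TSQLRecord)
  protected
    fTimeStamp: TDateTime;
    fEvent: RawUTF8;
  published
    property TimeStamp: TDateTime read fTimeStamp write fTimeStamp;
    property Event: RawUTF8 read fEvent write fEvent;
  end;

var
  Model: TSQLModel;
begin
  Model := TSQLModel.Create([TSQLAuditTrail]);
  // map the class to the external database, next to the legacy tables
  VirtualTableExternalRegister(Model, TSQLAuditTrail, Props, 'AUDIT_TRAIL');
  // then create the TSQLRestServerDB with this Model, as usual
end;
```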
Then, you will be able to identify seams - see Legacy code and existing projects - in your legacy code, move them to your new mORMot services, and let your application evolve into a newer SOA/MVC architecture, without breaking anything, nor starting from scratch.
Feedback is welcome on our forum, as usual.