Friday, September 9, 2011

TRANSITIVITY AND VERSIONING - THE TIT PRINCIPLE

Obviously, nobody looks at operational problems until "the thing happens". Then what? The problem gets fixed. But let's consider the situation: is it possible to avoid an operational problem by doing something like "tests"? The big scam of Web 2.0 is the idea of collaborationism: you, with nothing better to do, will cover for somebody who doesn't like to sit in front of the computer and do his or her own job. The maxim of the idiot. One interesting point, especially now that we read the forums and discussions, is that "everything is a canon of badly formed hypotheses". Like a JSON object that is never well formed: a weird, "problematic", "old" individual, complaining all the time because "my format is not a standard".
But back to the history of operations. Operations were born, and with the operations "Renaissance" came two models, OLTP and OLAP. Change the scale, though, and you arrive at "simple systems" with complex solutions, a singularity in the math: several complex formulas, a fest of to_char's and decode's. Yet even with all these complex solutions you still have operational problems, and that points to a very common mistake: Transitivity and Versioning are not being pushed down to the persistence mechanism. And look how interesting: suppose you use a blog to write your ideas and promote your superstars. While you write a few words, during your incredible abstraction about your superstars, you are exercising Transitivity and Versioning all the time. First, Transitivity: your phrases and thoughts form a graph, where nouns and adjectives can be considered vertices and verbs the edges. At the same time, Versioning: the "amazing blog team" built that button which, every few milliseconds, saves a "draft for you". Not a Pilsen; a Draft: soft, cold, at the right point.
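The "phrases as a graph" idea above can be sketched in a few lines. This is a minimal illustration, not a real NLP pipeline: the (noun, verb, noun) triples are hand-tagged here, and the names are hypothetical.

```python
# Sketch of "nouns are vertices, verbs are edges": build an adjacency list
# where each edge carries the verb as its label. Triples are hand-tagged
# for illustration; a real system would use a part-of-speech tagger.
from collections import defaultdict

def build_graph(triples):
    """Each triple is (subject_noun, verb, object_noun)."""
    graph = defaultdict(list)
    for subject, verb, obj in triples:
        graph[subject].append((verb, obj))
    return dict(graph)

# "The user saves a draft; the draft records history."
triples = [
    ("user", "saves", "draft"),
    ("draft", "records", "history"),
]
graph = build_graph(triples)
# graph["user"] → [("saves", "draft")]
```

The point is only that ordinary prose already has the shape of a graph, which is what makes transitivity a natural property to carry down into the persistence layer.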
Now let's go back to "Operations". Suppose you have the following scenario: you have storage space and a machine, and you want to minimize your operational problems. Two actions can be taken. First, thinking about transitivity: records in your database are no longer updated in place on the same rowid; they are versioned. With this you release one thing called the lock on the record, but at the same time you take on a little more complexity managing concurrency on that record: you will have to create "merge operations" and "dirty views", and handle concurrent updates. But look how amazing: all of these concepts, or at least part of them, are already implemented in a solution called ORM (Object-Relational Mapping). And look how interesting: today there are at least dozens of well-known solutions, operationalized and tested in critical applications, running on "market standards". With an advantage: such solutions take all that computation away from the DBMS team and machines, transferring it to another layer, called by some the "business layer", which does not necessarily run on another machine but certainly runs in another process space, with its own grants and ACLs. On the other hand, you don't necessarily need to do any of this: "old code shall keep executing while it is in operation space", as some like to say. It is there, running; the count is 1 - 1 = 0, so what do you have to worry about?

Complexity, the antagonist of the KISS principle. Complexity in certain aspects increases system robustness and decreases operational problems, but when this paradigm is adopted, another demon comes to play: since you increased complexity in order to decrease operational problems, another team shall be born, agnostic and uncorrupted. The T Team, or Test Team (TIT). TIT comes from the expression "test, integrate and test": components are built and tested separately, then the solution is integrated and tested again. TIT rules.
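The "versioned instead of updated" idea can be sketched with an append-only table. This is a minimal illustration under assumed names (the `account` table, its columns, and the helper functions are all hypothetical): every "update" inserts a new version, the old row is never touched, and the current value is just the highest version.

```python
# Sketch of record versioning: UPDATE-in-place is replaced by an INSERT of
# a new version of the row, so no existing row is ever locked or mutated.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE account (
        id      INTEGER,
        version INTEGER,
        balance INTEGER,
        PRIMARY KEY (id, version)
    )
""")

def save_version(conn, record_id, balance):
    """Append a new version rather than updating the previous one."""
    cur = conn.execute(
        "SELECT COALESCE(MAX(version), 0) FROM account WHERE id = ?",
        (record_id,))
    next_version = cur.fetchone()[0] + 1
    conn.execute("INSERT INTO account VALUES (?, ?, ?)",
                 (record_id, next_version, balance))
    return next_version

def current(conn, record_id):
    """A simple 'latest version wins' view of the record."""
    cur = conn.execute(
        "SELECT balance FROM account WHERE id = ? "
        "ORDER BY version DESC LIMIT 1",
        (record_id,))
    row = cur.fetchone()
    return row[0] if row else None

save_version(conn, 1, 100)
save_version(conn, 1, 80)   # an "update" that never touches the old row
```

Concurrent writers would still need the merge operations mentioned above; this sketch only shows the lock-free write path.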
Second comes "versioning", not code versioning but record versioning: your records will now suffer from the transitivity and versioning "disease". Note that this should be implemented step by step, first on small projects, so the team gains the ability to manage merging and versioned records "ad hoc", until it becomes common practice; then on "large projects" the team will have acquired the vision, without the barriers of a lack of abstraction. And look how interesting: the data model becomes more "permissible", because records are never "touched" at the origin (which would require very rigid norms for managing records directly in production, paranoia factor at 10.0); instead they are copied, preserving every transition and the full history. That means the possibility of recovering the path that originated the problem just by following the history of the record itself, and not the "operational testimonial". Because the operational testimonial itself is covered by intrigue, conceit, self-conceit and presumption; in short, it "increases the overall cost".
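Tracing a problem through the record's own history, rather than through anyone's testimony, can be sketched as a walk over an append-only log. The log shape, author names and predicate below are hypothetical, chosen only to make the idea concrete.

```python
# Sketch of "follow the history, not the testimonial": every write is kept
# as a (version, author, value) entry, so the write that introduced a bad
# value can be found mechanically.
history = []  # append-only log of (version, author, value)

def write(author, value):
    history.append((len(history) + 1, author, value))

write("batch_job", 100)
write("web_form", -5)     # the offending write we want to locate
write("batch_job", 200)

def first_bad(history, predicate):
    """Walk the versions in order; return the first offending write."""
    for version, author, value in history:
        if predicate(value):
            return version, author, value
    return None

culprit = first_bad(history, lambda v: v < 0)
# culprit → (2, "web_form", -5)
```

With in-place updates the second write would have been overwritten by the third and the evidence lost; with versioning the path to the problem survives.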
By the way, does the reader know what a "permissible data model" is? It is known as the "DBA's Dream": several DB constraints are dropped, together with the triggers; those individuals move to the business layer, and the database remains clean, with just the records and, for sure, some metadata describing the model itself.
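What "constraints move to the business layer" looks like can be sketched as plain validation code. The rules and field names below are hypothetical; the point is only that the checks a CHECK constraint or trigger would have enforced now live in application code.

```python
# Sketch of business-layer validation replacing database constraints:
# the table keeps no CHECK constraints or triggers, so the application
# validates each record before persisting it.

RULES = {
    "balance": lambda v: isinstance(v, int) and v >= 0,
    "owner":   lambda v: isinstance(v, str) and v != "",
}

def validate(record):
    """Return the list of fields that violate the business-layer rules."""
    return [field for field, ok in RULES.items()
            if field not in record or not ok(record[field])]

violations = validate({"balance": -10, "owner": "alice"})
# violations → ["balance"]
```

The trade-off is the one the post already named: the database stays clean and permissible, but the business layer (and the TIT team testing it) now carries the responsibility the constraints used to.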
