Lecture notes | Pdf slides
Creating an initial project repository at https://gitlab.mi.hdm-stuttgart.de.
Adding your lecturer to your team, granting at least read access.
Committing a Readme.md file describing the project's goals.
This includes a precise description of the prototype's desired functionality, likely to be extended as your project progresses.
Identify individual tasks, e.g.:
Creating sample data.
Test server provisioning.
Setting up test scenarios.
Selecting a documentation tool set.
Assign team members to tasks.
Implementation resulting in:
Version-controlled source code (GitLab, GitHub, ...).
End-user deployment description / CI/CD pipeline.
End-user documentation.
Internal software documentation (architecture, design principles, frameworks, ...).
Don't start documenting too late. The »Real programmers don't document, the code is obvious« myth no longer works!
Command line argument handling.
Database access APIs, e.g. JDBC™.
> grep --color ❶ -i ❷ fraction App.java
package de.hdm_stuttgart.mi.sd1.fraction;
 * Playing with fraction objects.
final Fraction threeSeven = new Fraction(3, 7);
final Fraction
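Command line argument handling may be hand-rolled for a first prototype. The following is a minimal sketch under simplifying assumptions (an option always starts with "--" and greedily consumes the next token as its value); a mature parser library is the better long-term choice:

```java
import java.util.HashMap;
import java.util.LinkedList;
import java.util.List;
import java.util.Map;

/** Minimal parser: options start with "--", everything else is a positional argument. */
public class ArgParser {

    /** Option name → value (null for bare flags without a value). */
    final Map<String, String> options = new HashMap<>();
    final List<String> positional = new LinkedList<>();

    ArgParser(final String[] args) {
        for (int i = 0; i < args.length; i++) {
            if (args[i].startsWith("--")) {
                final String name = args[i].substring(2);
                // Simplification: consume the next token as value unless it is an option.
                // A real parser must know which options take values at all.
                if (i + 1 < args.length && !args[i + 1].startsWith("--")) {
                    options.put(name, args[++i]);
                } else {
                    options.put(name, null);   // bare flag, e.g. --verbose
                }
            } else {
                positional.add(args[i]);
            }
        }
    }

    public static void main(final String[] args) {
        final ArgParser p = new ArgParser(
            new String[]{"--profile", "postgresTest", "--like", "Smit%", "fraction"});
        System.out.println(p.options.get("profile"));  // postgresTest
        System.out.println(p.options.get("like"));     // Smit%
        System.out.println(p.positional);              // [fraction]
    }
}
```

Note the sketch cannot represent repeated options like --table User --table Stocks (the map overwrites earlier values); this alone is a good argument for using an established option-parsing library.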
Database queries are more complex than matching text.
Support for NoSQL databases, e.g. MongoDB.
Customer demands:
Restrict input record sets.
Filter / page output.
Connection profile handling.
Search level specification.
Output formatting, limiting/filtering and paging.

| Command | File ~/.dbgrep/Profiles/postgresTest.cfg |
|---|---|
| dbgrep --profile postgresTest ... | |
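Connection profiles below ~/.dbgrep/Profiles may be stored as simple key/value files. A sketch using java.util.Properties; the key names jdbcUrl, user and password are assumptions, not a given format:

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.Properties;

/** Loads a dbgrep connection profile from a properties-style configuration file. */
public class ProfileReader {

    /** Parse a profile, e.g. the content of ~/.dbgrep/Profiles/postgresTest.cfg. */
    static Properties load(final Reader in) throws IOException {
        final Properties profile = new Properties();
        profile.load(in);
        return profile;
    }

    public static void main(final String[] args) throws IOException {
        // Hypothetical profile content; a real file would live below ~/.dbgrep/Profiles.
        final String cfg =
            "jdbcUrl=jdbc:postgresql://localhost:5432/testdb\n" +
            "user=demo\n" +
            "password=secret\n";
        final Properties p = load(new StringReader(cfg));
        System.out.println(p.getProperty("jdbcUrl"));  // jdbc:postgresql://localhost:5432/testdb
    }
}
```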
dbgrep ...                                          ❶
dbgrep ... --table User --table Stocks ...          ❷
dbgrep ... --column User.userId ...                 ❸
dbgrep ... --table Stocks --column User.userId ...  ❹
dbgrep ... --equal 237                       ❶
dbgrep ... --greater 4.43                    ❷
dbgrep ... --like 'Smit%'                    ❸
dbgrep ... --like 'Smit%' --and --greater 4  ❹
dbgrep ... --range [-3:17]                   ❺
❶ Search for integer values equal to 237. Return either of:
❷ Search for numeric values being greater than 4.43.
❸ Texts starting with »Smit«.
❹ Conjunction: Records containing text starting with »Smit« and a numeric value greater than 4.
❺ Search for integer values between and including -3 and 17.
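The filter options above eventually have to be translated into SQL WHERE clauses. A sketch for some of them, using ? placeholders since real code should bind values through PreparedStatement rather than concatenating user input (method names are assumptions):

```java
/** Maps dbgrep filter options to SQL WHERE fragments using ? placeholders. */
public class FilterBuilder {

    static String equal(final String column)   { return column + " = ?"; }    // --equal
    static String greater(final String column) { return column + " > ?"; }    // --greater
    static String like(final String column)    { return column + " LIKE ?"; } // --like

    /** --range [lo:hi]: between and including both bounds. */
    static String range(final String column)   { return column + " BETWEEN ? AND ?"; }

    /** --and: conjunction of two fragments. */
    static String and(final String left, final String right) {
        return "(" + left + ") AND (" + right + ")";
    }

    public static void main(final String[] args) {
        // dbgrep ... --like 'Smit%' --and --greater 4 on two hypothetical columns:
        System.out.println("SELECT * FROM User WHERE "
            + and(like("name"), greater("rating")));
        // → SELECT * FROM User WHERE (name LIKE ?) AND (rating > ?)
    }
}
```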
Consider two potentially differing database systems, e.g. PostgreSQL™ and MySQL™:
The source hosting an active instance, i.e. a set of tables containing data records and optionally views.
The destination being yet empty or containing only non-conflicting table and view names.
We assume full JDBC™ read access to the source database and full read/write access to the destination database.
The task: a source-to-destination copy addressing vendor-specific SQL syntax rules.
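One example of vendor-specific SQL syntax is identifier quoting: standard SQL and PostgreSQL use double quotes, while MySQL defaults to backticks. A sketch (the enum is an assumption; at runtime JDBC's DatabaseMetaData.getIdentifierQuoteString() reports the actual quote string):

```java
/** Identifier quoting differs between database vendors. */
public class IdentifierQuoter {

    enum Vendor { POSTGRESQL, MYSQL }

    static String quote(final Vendor vendor, final String identifier) {
        switch (vendor) {
            case MYSQL:      return "`" + identifier + "`";   // MySQL default SQL mode
            case POSTGRESQL: return "\"" + identifier + "\""; // Standard SQL / PostgreSQL
            default: throw new IllegalArgumentException("Unknown vendor");
        }
    }

    public static void main(final String[] args) {
        System.out.println(quote(Vendor.POSTGRESQL, "User")); // "User"
        System.out.println(quote(Vendor.MYSQL, "User"));      // `User`
    }
}
```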
Transfer as many integrity constraints as possible:
NULL / NOT NULL defaults may differ between database systems.
The destination database may be a non-SQL database like MongoDB supporting only a limited subset of schema constraints. A copy tool thereby supports database migration.
Useful technologies:
JDBC in general and java.sql.ResultSetMetaData and friends in particular.
The tool may be implemented as a CLI application using a standard command line option handling parser. See CLI Comparison as well.
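A sketch of how column metadata, as obtainable from java.sql.ResultSetMetaData or DatabaseMetaData.getColumns(...), might be turned into destination DDL. The Column record merely mimics a small subset of that metadata and is an assumption, not the JDBC API itself:

```java
import java.util.List;
import java.util.stream.Collectors;

/** Generates CREATE TABLE statements from column metadata records. */
public class DdlGenerator {

    /** Subset of what ResultSetMetaData / DatabaseMetaData.getColumns(...) provides. */
    record Column(String name, String sqlType, boolean nullable) {}

    static String createTable(final String table, final List<Column> columns) {
        return "CREATE TABLE " + table + " (\n  "
            + columns.stream()
                .map(c -> c.name() + " " + c.sqlType() + (c.nullable() ? "" : " NOT NULL"))
                .collect(Collectors.joining(",\n  "))
            + "\n)";
    }

    public static void main(final String[] args) {
        // Hypothetical table from the source database:
        System.out.println(createTable("Stocks", List.of(
            new Column("stockId", "INTEGER", false),
            new Column("name", "VARCHAR(80)", true))));
    }
}
```

A real implementation would additionally map vendor type names, e.g. PostgreSQL's serial versus MySQL's AUTO_INCREMENT.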
Precondition: an existing database schema.
Populate a corresponding database with test data.
Allow for configuration, e.g. table-dependent record set sizes.
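Table-dependent record set sizes can be sketched as follows. Table names, column layout and the per-table size configuration are made up; real output would be INSERT statements or direct JDBC batch writes:

```java
import java.util.LinkedList;
import java.util.List;
import java.util.Map;
import java.util.Random;

/** Generates per-table test records; record counts are configurable per table. */
public class TestDataGenerator {

    static List<String> inserts(final String table, final int count, final long seed) {
        final Random random = new Random(seed);   // fixed seed: reproducible test data
        final List<String> result = new LinkedList<>();
        for (int id = 1; id <= count; id++) {
            result.add("INSERT INTO " + table + " VALUES (" + id + ", "
                + random.nextInt(1000) + ")");
        }
        return result;
    }

    public static void main(final String[] args) {
        // Hypothetical configuration: few users, more stock records.
        final Map<String, Integer> sizes = Map.of("User", 3, "Stocks", 5);
        sizes.forEach((table, count) ->
            inserts(table, count, 42).forEach(System.out::println));
    }
}
```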
Consider an RDBMS with a given set of tables, data records and constraints. Software evolution requires schema evolution:
Adding new tables and views.
Adding / replacing data columns.
Changing column types.
Adding / removing / changing integrity constraints.
Upgrading may involve:
Dumping the existing data to a series of JSON or XML files along with a database schema export.
Post-modifying both exported data and schema to meet the desired version's schema.
In practice a database's size may effectively prohibit validation due to memory / performance limits.
Importing data and integrity constraints into the upgraded database.
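The post-modification step may be viewed as a transformation applied to every exported record. A sketch of one hypothetical upgrade step, renaming a column and supplying a default value for a newly added NOT NULL column (all column names are made up):

```java
import java.util.HashMap;
import java.util.Map;

/** One schema upgrade step applied to exported records (e.g. parsed from JSON/XML dumps). */
public class UpgradeStep {

    /** Rename "fullname" to "name"; add "createdAt" with a default for pre-existing records. */
    static Map<String, Object> upgrade(final Map<String, Object> record) {
        final Map<String, Object> result = new HashMap<>(record);
        if (result.containsKey("fullname")) {
            result.put("name", result.remove("fullname"));
        }
        result.putIfAbsent("createdAt", "1970-01-01"); // default for the new NOT NULL column
        return result;
    }

    public static void main(final String[] args) {
        final Map<String, Object> old =
            new HashMap<>(Map.of("id", 1, "fullname", "Jill Smith"));
        System.out.println(upgrade(old));
    }
}
```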