Interactive Pyaella using Dashi

Using Dashi you can start an interactive session to query, browse, experiment, and run tests... as well as perform full Create, Read, Update, and Delete operations, depending on the role set for you in the database.

Dashi doesn’t start an interactive shell of its own, so it must be started with the Python interpreter's -i argument.

To start a Dashi run, cd into a Pyaella project, make sure the correct PYTHONPATH is exported, and run a command in a shell such as this:

python -i -m pyaella.orm.dashi -c mysite/app.yaml

The first argument, -i, puts us in interactive mode, so the Python interpreter will not exit at the end of its entry point. The -m argument sets the entry point of the interpreter, which in this case is Dashi. With -c we provide Dashi a Pyaella application configuration file, which describes the application, where to find the Pyaella domain files for the ORM, and so on.

Dashi will set up the environment, dynamically load the compiled Pyaella data models, inject the schemas, reflect anything specified, and set up the SQLAlchemySessionFactory.

Getting a Session

By calling get_session() you get a new Session object, already attached to a connection in the connection pool, which in turn is connected to the database specified in the application configuration file supplied by the -c command line argument.

s = get_session()

This is a full SQLAlchemy Session object and can perform all of its normal and expected functions.
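
For instance, the Session supports the usual SQLAlchemy operations. A minimal sketch, assuming the Asset model used in the examples below:

s = get_session()

# count rows in the assets table
count = s.query(~models.Asset).count()

# fetch a single row by primary key
a = s.query(~models.Asset).get(6678036)

# discard pending changes and return the connection to the pool
s.rollback()
s.close()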

Working with logical models

Logical models and physical data models are different things. Logical models describe things in an abstract, decoupled way more suited to logical expression and execution. Physical data models define how attributes and constraints are created and stored in persistence layers. (This is the Fisher Price explanation, of course.)

Here we use 'logical models' to mean logical data models, and 'data models' to mean physical data models... an abbreviation used throughout.

The logical models in Pyaella are dynamically created from metaclasses, using dependency injection from a domain or schema file. These models are accessible in Dashi through the dynamically created models namespace. Any model described in the domain file is then accessible in this module.

asset = models.Asset

Pyaella models are logic objects, not pure data models, so the underlying or canonical ORM state of a Pyaella model is accessed using the tilde ~ operator. More on logic objects, state, and the ORM capabilities of Pyaella objects is described (here).
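
A quick sketch of the distinction, using the Asset model from the examples below:

Asset = models.Asset      # the Pyaella logic object, generated from the domain file
AssetORM = ~models.Asset  # the canonical SQLAlchemy-mapped class behind it

# the ORM class is what gets handed to a Session query
s = get_session()
q = s.query(AssetORM)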

A simple query for an Asset

s = get_session()
q = s.query(~models.Asset).filter((~models.Asset).asset_id==6678036)
rp = q.all()
for asset in rp:
    print asset.asset_id

Here we get a session from the SQLAlchemySessionFactory using get_session(). Using this Session we create a query for Assets, filtering for Assets that have the asset_id 6678036.

When we run the query, asking for all() results, we receive a ResultProxy, a collection that emulates a Python list, so we can iterate over the results.

Of course we could have run the query and asked for just first(), receiving just the one row, as the asset_id in this example is the primary key of this table.
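
A minimal sketch of that first() variant, assuming the same Asset model:

s = get_session()
A = ~models.Asset

# asset_id is the primary key, so at most one row can match
asset = s.query(A).filter(A.asset_id==6678036).first()
if asset is not None:
    print asset.asset_id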

Here is a quick example of a single JOIN query coded in a typical Pyaella style, 'prototyping' the Pyaella model's ORM mapping with the ~ canonical access operator for readability.

# get a session
s = get_session()

A, AL = ~models.Asset, ~models.AssetLocation

# create query, joining Asset with AssetLocation on asset_id
rp = (s.query(A, AL)
        .filter(A.asset_id==AL.asset_id)
        .filter(A.asset_id==6678036)).all()

# for each row in the result
# unpack the tuple row
for row in rp:
    asset, asset_location = row

Pyaella model characteristics, introspection

A Pyaella model has methods to help describe it to the world, namely to help work with its associated Table in a database (if it has one), its columns and types, and its constraints.

It does this by working through the underlying SQLAlchemy objects, and exposes them for direct business-logic manipulation and expression.

Getting the FieldDef object from an Asset object and printing its string representation displays the Column definition from SQLAlchemy. However, the FieldDef object has many methods of its own to work with or express the fields of the Pyaella entity, which may or may not be directly associated with a database table or view.

>>> fld_def = asset.field_def('asset_id')
>>> print fld_def
Column(BIGINT, nullable=False, unique=None, primary_key=True)
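
As a further sketch, the same introspection can be applied across every field of the entity; this assumes the Fields property, shown below for AssetLocation, is also available on Asset:

>>> for field in asset.Fields:
...     print field, asset.field_def(field)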

Example of a Dashi run

Here we get a session from the SQLAlchemySessionFactory and, after getting the mapped class for AssetLocation, query for a row in the asset_locations table.

The object returned is a normal SQLAlchemy object, attached to the session, which is attached to its underlying connection pool.

We can create a Pyaella entity by passing the ORM object into the __init__, which allows us to get properties and attributes specified by the application’s domain.

>>> s = get_session()
>>> AL = ~models.AssetLocation
>>> rp = s.query(AL).filter(AL.asset_location_id==8517839).all()
>>> rp
[<pyaella.orm.AssetLocation object at 0x1057acc10>]
>>> al = models.AssetLocation(entity=rp[0])
>>> al
<saimin.models.AssetLocation object at 0x1057a9ed0>
>>> al.PrimaryKeyName
u'asset_location_id'

>>> for field in al.Fields:
...     field
...
u'asset_location_id'
u'path_id'
u'asset_id'
u'folder_from_path'
u'alt_filename'
u'initial_entry_by'
u'initial_entry_date'
u'last_upd'
u'last_uid'
u'last_operation'
u'qc_status_id'
u'archive_status'

Every Pyaella entity can be converted to a standard Python dictionary using the to_dict() method.

>>> import pprint
>>> d = al.to_dict()
>>> pprint.pprint(d)
{u'alt_filename': None,
 u'archive_status': u'ON',
 u'asset_id': 6677924L,
 u'asset_location_id': 8517839L,
 u'folder_from_path': u'_4/5000/4548/',
 u'initial_entry_by': u'mr_jobs_godzilla',
 u'initial_entry_date': datetime.datetime(2012, 9, 5, 13, 13, 1, 895192),
 u'last_operation': u'UPDATE',
 u'last_uid': u'mr_jobs_godzilla',
 u'last_upd': datetime.datetime(2012, 9, 22, 22, 31, 35, 720669),
 u'path_id': 3,
 u'qc_status_id': 4}

But that doesn’t necessarily mean that the data is 'jsonable', Pyaella's default and preferred method of data representation. In this example the json module throws an exception when attempting to serialize a Python datetime object.

>>> import json
>>> json.dumps(d)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/__init__.py", line 243, in dumps
    return _default_encoder.encode(obj)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/encoder.py", line 207, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/encoder.py", line 270, in iterencode
    return _iterencode(o, 0)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/encoder.py", line 184, in default
    raise TypeError(repr(o) + " is not JSON serializable")
TypeError: datetime.datetime(2012, 9, 22, 22, 31, 35, 720669) is not JSON serializable

However, this is not an issue for most of the common object types supported by Pyaella. Each model also has a __json__ special method to serialize an entity appropriately. This method takes a webob request object as an argument, but in this example we can pass None.

>>> al.__json__
<bound method AssetLocation.__json__ of <saimin.models.AssetLocation object at 0x1057a9ed0>>
>>> al.__json__(None)
{u'asset_id': 6677924L, u'folder_from_path': u'_4/5000/4548/', u'last_operation': u'UPDATE', u'archive_status': u'ON', u'qc_status_id': 4, u'initial_entry_by': u'mr_jobs_godzilla', u'path_id': 3, u'last_upd': datetime.datetime(2012, 9, 22, 22, 31, 35, 720669), u'alt_filename': None, u'asset_location_id': 8517839L, u'initial_entry_date': datetime.datetime(2012, 9, 5, 13, 13, 1, 895192), u'last_uid': u'mr_jobs_godzilla'}
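
If you do need to json.dumps a dictionary that still contains datetime values, one generic workaround (standard-library json, not Pyaella-specific; _jsonable is a hypothetical helper) is to supply a default handler:

>>> import datetime, json
>>> def _jsonable(obj):
...     # fall back to ISO 8601 strings for datetime values
...     if isinstance(obj, datetime.datetime):
...         return obj.isoformat()
...     raise TypeError(repr(obj) + " is not JSON serializable")
...
>>> json.dumps(al.__json__(None), default=_jsonable)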