Pluggable databases in apache karaf


Pluggable databases in apache karaf

Steinar Bang
This blog post describes the way I switch between different database
systems (in my case: switching between Derby and PostgreSQL) in Apache
Karaf:
 https://steinar.bang.priv.no/2019/10/26/pluggable-databases-for-apache-karaf-applications/

What I do is:
 1. Create an application-specific DatabaseService interface
 2. Inject the application-specific DatabaseService into the business
    logic DS component
 3. Create an OSGi bundle defining a Liquibase schema
 4. Create a DS component providing the DatabaseService that starts a
    Derby in-memory database and installs the Liquibase schema
 5. Create a DS component providing the DatabaseService that connects to
    PostgreSQL and installs the Liquibase schema
 6. Create different Karaf features that
    1. load the application with the Derby DS component
    2. load the application with the PostgreSQL DS component

More detail and examples in the blog post.
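Steps 1 and 2 could be sketched roughly like this as OSGi Declarative Services code (a sketch only; the interface and component names here are made up, not taken from the blog post):

```java
import javax.sql.DataSource;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;

// 1. An application-specific database service interface
interface MyAppDatabaseService {
    DataSource getDatasource();
}

// 2. A business logic DS component with the service injected
@Component
class MyAppBusinessLogic {
    @Reference
    private MyAppDatabaseService database;

    // The business logic gets its JDBC connections from
    // database.getDatasource(), without knowing which RDBMS is behind it
}
```

The point of the interface is that the business logic only ever sees MyAppDatabaseService; which implementation (Derby or PostgreSQL) gets wired in is decided by the Karaf feature that is installed.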


Re: Pluggable databases in apache karaf

jbonofre
Hi

Thanks for sharing, I will take a look.

The purpose is to have a service layer? What's the difference with pax-jdbc and the Karaf JDBC feature?

Regards
JB

On 26 Oct 2019 13:21, Steinar Bang <[hidden email]> wrote:




Re: Pluggable databases in apache karaf

Steinar Bang
>>>>> [hidden email]:

> Hi
> Thanks for sharing, I will take a look.

> The purpose is to have a service layer ?

The purpose is to have a database that is ready to be used
(i.e. connected and with a schema) by the business logic code, and to
make it easy to switch databases.

> What's the difference with pax-jdbc and karaf JDBC feature ?

It builds on top of them.

pax-jdbc provides a DataSourceFactory.

The components described in the blog post provide a DataSource.

The addition to pax-jdbc is actually connecting to the database and
using liquibase to set up/modify the schema and insert initial data.

I.e. my application-specific database DS components use pax-jdbc (in
the case of Derby) and the PostgreSQL driver to get the
DataSourceFactory.

When the application-specific database DS components receive a
DataSourceFactory injection and are activated, the first thing they do
is get a DataSource from the DataSourceFactory. This DataSource is
kept around while the DS component is active.

Before exposing any service, the application-specific database DS
component will run Liquibase scripts to set up/update the schema (and
add initial data), and when the scripts have run it exposes the
application-specific database service.
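As a sketch, the Derby variant of such a component could look something like this (the class and changelog names are made up, and this is not the blog post's actual code; the Liquibase and OSGi JDBC calls are from their public APIs):

```java
import java.util.Properties;
import javax.sql.DataSource;
import liquibase.Liquibase;
import liquibase.database.jvm.JdbcConnection;
import liquibase.resource.ClassLoaderResourceAccessor;
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;
import org.osgi.service.jdbc.DataSourceFactory;

@Component(service = MyAppDatabaseService.class)
class DerbyDatabaseService implements MyAppDatabaseService {
    // Injected by pax-jdbc-derby
    @Reference(target = "(osgi.jdbc.driver.name=derby)")
    private DataSourceFactory dataSourceFactory;

    private DataSource dataSource;

    @Activate
    void activate() throws Exception {
        // First thing on activation: get a DataSource and keep it around
        Properties props = new Properties();
        props.setProperty(DataSourceFactory.JDBC_URL,
                          "jdbc:derby:memory:myapp;create=true");
        dataSource = dataSourceFactory.createDataSource(props);

        // Run the Liquibase scripts before the service is exposed;
        // DS only publishes the service after activate() returns
        try (var connection = dataSource.getConnection()) {
            new Liquibase("db/changelog.xml",
                          new ClassLoaderResourceAccessor(getClass().getClassLoader()),
                          new JdbcConnection(connection)).update("");
        }
    }

    @Override
    public DataSource getDatasource() {
        return dataSource;
    }
}
```

The PostgreSQL variant would be the same shape with a different @Reference target and JDBC URL.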


Re: Pluggable databases in apache karaf

jbonofre
Hi,

OK, understood.

It's pretty close to the Karaf JPA and JDBC examples, right?

Thanks anyway!

It's great to see different Karaf use cases!

Regards
JB

On 26/10/2019 16:08, Steinar Bang wrote:


Re: Pluggable databases in apache karaf

cschneider
In reply to this post by Steinar Bang
Hi Steinar,

You do not have to build your own layer to use Liquibase: pax-jdbc-config has a preHook that can help with this.

See
 https://github.com/cschneider/Karaf-Tutorial/blob/master/liquibase/service/src/main/java/net/lr/tutorial/db/service/Migrator.java
and
 https://github.com/cschneider/Karaf-Tutorial/blob/master/liquibase/org.ops4j.datasource-person.cfg#L4

The preHook attribute allows selecting a PreHook service by name.
This service is called before the DataSource is published.
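Following the Migrator example from Christian's tutorial, a PreHook service can be sketched like this (the service name "myappdb" and config file name are invented; interface and property names follow pax-jdbc-config as I understand it, so verify against the pax-jdbc docs):

```java
import java.sql.SQLException;
import javax.sql.DataSource;
import org.ops4j.pax.jdbc.hook.PreHook;
import org.osgi.service.component.annotations.Component;

// Registered with a "name" property so the datasource config can select it
@Component(property = "name=myappdb")
class SchemaMigrator implements PreHook {
    @Override
    public void prepare(DataSource ds) throws SQLException {
        // Set up/update the schema here (e.g. run Liquibase against ds)
        // before pax-jdbc-config publishes the DataSource service
    }
}
```

The hook is then selected by name from the datasource config, e.g. a line like `ops4j.preHook=myappdb` in etc/org.ops4j.datasource-myapp.cfg (hypothetical file name).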

Christian
 

On Sat, 26 Oct 2019 at 16:06, Steinar Bang <[hidden email]> wrote:


--
Christian Schneider

http://www.liquid-reality.de

Computer Scientist


Re: Pluggable databases in apache karaf

jbonofre

Yup, I have the same comment.

By the way, Christian, any plan to donate your examples/tutorial to pax-jdbc and/or Karaf?

They would be part of the itests and dev guide, and valuable for the devs IMHO.

Just my $0.01 ;)

Regards
JB

On 26/10/2019 17:37, Christian Schneider wrote:


Re: Pluggable databases in apache karaf

Steinar Bang
In reply to this post by jbonofre
>>>>> Jean-Baptiste Onofré <[hidden email]>:

> It's pretty close to Karaf JPA and JDBC examples right ?

Not sure.  The JDBC example gets a DataSource, and I'm unable to see
where the DataSource comes from:
 https://github.com/apache/karaf/blob/master/examples/karaf-jdbc-example/karaf-jdbc-example-provider/src/main/java/org/apache/karaf/examples/jdbc/provider/BookingServiceJdbcImpl.java#L74

Hm... it loads pax-jdbc-derby, but that would only give it a
DataSourceFactory...?
 https://github.com/apache/karaf/blob/master/examples/karaf-jdbc-example/karaf-jdbc-example-features/src/main/feature/feature.xml#L49

Something needs to take that DataSourceFactory and create a DataSource
for an actual database.  And I can't find that something in
 https://github.com/apache/karaf/tree/master/examples/karaf-jdbc-example

Hm... the first <config> element seems to contain the config necessary
to create a DataSource for an in-memory Derby, so it is possibly
karaf-jdbc-example-provider that creates the DataSource...?  But I'm
unable to see what in there does the actual work.

So OK the differences:
 1. Going from DataSourceFactory to DataSource
   a. The JDBC example uses some magic I don't know about to get from a
      DataSourceFactory to a DataSource (something that I should know
      about perhaps, but don't), based on config in the feature
   b. The method of my blog post introduces a DS component that listens
      for a DataSourceFactory and provides a DataSource
 2. Schema setup
   a. The JDBC example has derby-specific table setup in the
      BookingServiceImpl class
       https://github.com/apache/karaf/blob/master/examples/karaf-jdbc-example/karaf-jdbc-example-provider/src/main/java/org/apache/karaf/examples/jdbc/provider/BookingServiceJdbcImpl.java#L46
   b. the DS component of the blog post uses liquibase to set up the
      schema in a database-independent manner
 3. Switching databases
   a. To swap databases, the config magic in the feature has to be changed
   b. The method in the blog post allows switching between databases by
      swapping the DS component for a different component

The core point of the blog post is that the JDBC connection and setup
are lifted out of the business logic and into a pluggable component.
And then I swap that component when I want to use PostgreSQL instead
of Derby.


The JPA example expects a JpaTemplate
 https://github.com/apache/karaf/blob/master/examples/karaf-jpa-example/karaf-jpa-example-provider/karaf-jpa-example-provider-ds/karaf-jpa-example-provider-ds-eclipselink/src/main/java/org/apache/karaf/examples/jpa/provider/ds/eclipselink/BookingServiceImpl.java#L34
and doesn't use any JDBC stuff directly...? (I couldn't find the string
"jdbc" in the example)
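For what it's worth, the feature split behind "swapping the DS component" could be sketched like this (all feature and bundle names here are invented, not from the blog post):

```xml
<!-- Sketch of step 6 from the first post: two features, same application,
     different database DS component -->
<features name="myapp" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
  <feature name="myapp-with-derby">
    <feature>myapp-core</feature>
    <bundle>mvn:com.example/myapp-db-derby/1.0.0</bundle>
  </feature>
  <feature name="myapp-with-postgresql">
    <feature>myapp-core</feature>
    <bundle>mvn:com.example/myapp-db-postgresql/1.0.0</bundle>
  </feature>
</features>
```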


Re: Pluggable databases in apache karaf

Steinar Bang
In reply to this post by cschneider
>>>>> Christian Schneider <[hidden email]>:

> you do not have to build your own layer to use
> liquibase. Pax-jdbc-config has a preHook that can help with this.

> See
> https://github.com/cschneider/Karaf-Tutorial/blob/master/liquibase/service/src/main/java/net/lr/tutorial/db/service/Migrator.java
> and
> https://github.com/cschneider/Karaf-Tutorial/blob/master/liquibase/org.ops4j.datasource-person.cfg#L4


> The preHook attribute allows to select a PreHook service by name.
> This service is called before the DataSource is published.

Hm... interesting.  I didn't know about pax-jdbc-config.  Thanks for
sharing.

But I don't think the Migrator will be sufficient to handle all that I
do with liquibase.  Where does Migrator load the changesets file from?


Re: Pluggable databases in apache karaf

Steinar Bang
>>>>> Steinar Bang <[hidden email]>:

> But I don't think the Migrator will be sufficient to handle all that I
> do with liquibase.  Where does Migrator load the changesets file from?

By "where" I mean: from a file on disk? from the classpath? from
somewhere else?

Thanks!


Re: Pluggable databases in apache karaf

jbonofre
In reply to this post by Steinar Bang
Hi

I think you should take a look at pax-jdbc-config.

It can simplify your stuff a lot.

Regards
JB

On 26 Oct 2019 19:46, Steinar Bang <[hidden email]> wrote:




Re: Pluggable databases in apache karaf

Steinar Bang
In reply to this post by Steinar Bang
>>>>> Steinar Bang <[hidden email]>:

>> But I don't think the Migrator will be sufficient to handle all that I
>> do with liquibase.  Where does Migrator load the changesets file from?

> By "where" I mean: from a file on disk? from the classpath? from
> somewhere else?

From the classpath, it looks like:
 https://github.com/cschneider/Karaf-Tutorial/tree/master/liquibase/service/src/main/resources/db


Re: Pluggable databases in apache karaf

Steinar Bang
In reply to this post by jbonofre
>>>>> [hidden email]:

> I think you should take a look on pax-jdbc-config.

> It can simplify a lot your stuff.

It will certainly simplify this project (and make it less bound to
PostgreSQL), and I will try it there:
 https://github.com/steinarb/sonar-collector

But for the projects that prompted the blog post there probably won't
be much gain.  Most of the code is related to Liquibase, and I would
have had to have three OSGi bundles anyway (one for the schema, one
for test data and one for initial data in a production database).  The
code to connect to a database is about the same number of lines as the
config, so it would boil down to a preference between code and config
(where I usually land on code).  And in addition, the database service
proved a nice place to put the SQL statements that differ between
Derby and PostgreSQL.
 https://github.com/steinarb/authservice
 https://github.com/steinarb/ukelonn
 https://github.com/steinarb/handlereg

(Changing to pax-jdbc-config *would* make it simpler to use different
databases and re-use the Liquibase scripts, e.g. use H2 for the test
database and MySQL for the production database, but having that
flexibility isn't important to me for these projects.)


Re: Pluggable databases in apache karaf

Steinar Bang
>>>>> Steinar Bang <[hidden email]>:

> It will certainly simplify this project (and make it less bound to
> postgresql), and I will try it there:
>  https://github.com/steinarb/sonar-collector

I've added pax-jdbc-config setup to the template feature.xml
 https://github.com/steinarb/scratch/blob/sonar-collector/use-pax-jdbc-config/sonar-collector-webhook/src/main/feature/feature.xml#L5

The DataSource is created (it seems) but not matched by the
@Reference.

The initial install fails with:
 Error executing command: Unable to resolve root: missing requirement [root] osgi.identity; osgi.identity=sonar-collector-webhook-with-postgresql; type=karaf.feature; version="[0,0.0.0]"; filter:="(&(osgi.identity=sonar-collector-webhook-with-postgresql)(type=karaf.feature)(version>=0.0.0)(version<=0.0.0))" [caused by: Unable to resolve sonar-collector-webhook-with-postgresql/0.0.0: missing requirement [sonar-collector-webhook-with-postgresql/0.0.0] osgi.identity; osgi.identity=sonar-collector-webhook; type=karaf.feature [caused by: Unable to resolve sonar-collector-webhook/1.0.1.SNAPSHOT: missing requirement [sonar-collector-webhook/1.0.1.SNAPSHOT] osgi.identity; osgi.identity=no.priv.bang.sonar.sonar-collector-webhook; type=osgi.bundle; version="[1.0.1.SNAPSHOT,1.0.1.SNAPSHOT]"; resolution:=mandatory [caused by: Unable to resolve no.priv.bang.sonar.sonar-collector-webhook/1.0.1.SNAPSHOT: missing requirement [no.priv.bang.sonar.sonar-collector-webhook/1.0.1.SNAPSHOT] osgi.service; effective:=active; filter:="(&(objectClass=javax.sql.DataSource)(osgi.jndi.service.name = jdbc/sonar-collector))"]]]

Command and error message here: https://gist.github.com/steinarb/f216c322d7428b4c9e835b6ca9a77168

I tried installing the feature creating the DataSource service, and as
far as I can tell it creates the service with the correct
osgi.jndi.service.name:
 https://gist.github.com/steinarb/402d4af50cd56014b8bf70e17897771c

At this point I know I have a DataSource service present with (as far
as I can tell) the correct osgi.jndi.service.name value, so I tried
installing the feature that needs the service (i.e. the feature named
sonar-collector-webhook), but it fails, unable to match the DataSource
service:
 https://gist.github.com/steinarb/77db0de552f2bacf478a533baf1558c9

I tried installing the feature sonar-collector-webhook-with-postgresql
again, but got the same error as the other two install attempts.

Any ideas?
 


Re: Pluggable databases in apache karaf

jbonofre
That's not a problem with the service itself; it's a missing requirement.

You can see how it works in the Karaf JPA example: the capability is
defined in the features providing the datasource:

https://github.com/apache/karaf/blob/master/examples/karaf-jpa-example/karaf-jpa-example-features/src/main/feature/feature.xml#L28

So it's nothing related to the service itself at runtime, just cap/req matching.
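Applied to sonar-collector, the capability declaration might be sketched like this (the rest of the feature content is elided; the JNDI name matches the one in the error message above):

```xml
<!-- Sketch following the Karaf JPA example: the feature providing the
     DataSource declares the osgi.service capability so the feature
     resolver can match the bundle's requirement -->
<feature name="sonar-collector-webhook-with-postgresql">
  <capability>
    osgi.service;objectClass=javax.sql.DataSource;effective:=active;osgi.jndi.service.name=jdbc/sonar-collector
  </capability>
  ...
</feature>
```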

Regards
JB

On 02/11/2019 16:09, Steinar Bang wrote:


--
Jean-Baptiste Onofré
[hidden email]
http://blog.nanthrax.net
Talend - http://www.talend.com

Re: Pluggable databases in apache karaf

Steinar Bang
>>>>> Jean-Baptiste Onofré <[hidden email]>:

> That's not a problem with the service itself, that's a missing requirement.
> You can see how it works in Karaf jpa example: the capability is defined
> in the features providing the datasource:

> https://github.com/apache/karaf/blob/master/examples/karaf-jpa-example/karaf-jpa-example-features/src/main/feature/feature.xml#L28

> So, nothing related to service itself at runtime, just cap/req matching.

Thanks, JB! Now it works:
 https://github.com/steinarb/sonar-collector/blob/master/sonar-collector-webhook/src/main/feature/feature.xml#L13