Using custom log4j appenders under Karaf 2.1.4


Using custom log4j appenders under Karaf 2.1.4

mgardiner
Hi,

Are there any examples available showing how to use custom log4j appenders under Karaf?  I see the following note in the user's guide:

"If you plan to use your own appenders, you need to create an OSGi bundle and attach it as a fragment to the bundle with a symbolic name of org.ops4j.pax.logging.pax-logging-service. This way, the underlying logging system will be able to see and use your appenders."

We have a custom remoting logging appender we wish to utilize with our project hosted in Karaf 2.1.4.

I am assuming we turn our appenders jar into a bundle with the fragment host set to org.ops4j.pax.logging.pax-logging-service and deploy it with our project.  Is that correct?

How do you recommend handling multiple log4j.xml files for the different deployment environments, such as development, staging, and production?

Thanks.

-Mike-

Re: Using custom log4j appenders under Karaf 2.1.4

Guillaume Nodet
On Tue, Apr 5, 2011 at 23:43, mgardiner <[hidden email]> wrote:

> I am assuming we turn our appenders jar into a bundle with the fragment host
> set to org.ops4j.pax.logging.pax-logging-service and deployed with our
> project.  Is that correct?

Yep, that's correct.
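For reference, a minimal sketch of the manifest such a fragment might carry (the symbolic name is made up; Fragment-Host is the essential header):

Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.example.logging.appenders
Bundle-Version: 1.0.0
Fragment-Host: org.ops4j.pax.logging.pax-logging-service

The appender classes just need to be packaged inside that jar; because the fragment attaches to pax-logging-service, they end up on its class path.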

>
> How do you recommend handling multiple log4j.xml files for each deployment
> environment such as development, staging, and production.
>

Good question.  We don't usually use XML files for log4j, but rather the
properties-based configuration.
Take a look at the etc/org.ops4j.pax.logging.cfg file, which is
actually a log4j config file.
That file is used to configure log4j, so I think you should only touch
that file and configure it differently for each of your environments.
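Once the fragment is attached, you reference your appender class from that same file like any other log4j appender. A minimal sketch, assuming a hypothetical com.example.logging.RemoteAppender class with a hypothetical host property (check your appender for its real settings):

log4j.rootLogger=INFO, out, remote, osgi:VmLogAppender

# "remote" points at the custom appender class shipped in the fragment
log4j.appender.remote=com.example.logging.RemoteAppender
log4j.appender.remote.layout=org.apache.log4j.PatternLayout
log4j.appender.remote.layout.ConversionPattern=%d{ISO8601} | %-5.5p | %c | %m%n
# "host" is a made-up property of the made-up appender
log4j.appender.remote.host=logs.example.com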




--
Cheers,
Guillaume Nodet
------------------------
Blog: http://gnodet.blogspot.com/
------------------------
Open Source SOA
http://fusesource.com

Re: Using custom log4j appenders under Karaf 2.1.4

mikevan
A question came from one of my developers on this issue today.  In our application, we have a log4j.xml file for each of our bundles.  When run outside of OSGi, these XML files create specific log files for each bundle.  When run inside of Karaf, it appears all of these logs are aggregated into karaf.log.  So, two questions:
1) is this the expected behaviour?
2) is there an example of how to get log messages written into different log files?


Re: Using custom log4j appenders under Karaf 2.1.4

Guillaume Nodet
On Wed, Apr 6, 2011 at 19:00, mikevan <[hidden email]> wrote:

> A question came from one of my developers on this issue today.  In our
> application, we have a log4j.xml file for each of our bundles.  When run
> outside of OSGi, these xml files create specific log files for each bundle.
> When run inside of Karaf, it appears all of these logs are aggregated into
> karaf.log.  So, two questions:
> 1) is this the expected behaviour?

Yes

> 2) is there an example of  how to get log messages written into different
> log files?

Yes :-)
See http://karaf.apache.org/manual/2.2.0/users-guide/logging-system.html#OSGispecificMDCattributes
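The relevant piece there is the MDC sifting appender in etc/org.ops4j.pax.logging.cfg. A rough sketch of how it is typically set up (enable it by adding sift to the root logger):

log4j.rootLogger=INFO, out, sift, osgi:VmLogAppender

# one log file per bundle, keyed on the bundle.name MDC attribute
log4j.appender.sift=org.apache.log4j.sift.MDCSiftingAppender
log4j.appender.sift.key=bundle.name
log4j.appender.sift.default=karaf
log4j.appender.sift.appender=org.apache.log4j.FileAppender
log4j.appender.sift.appender.layout=org.apache.log4j.PatternLayout
log4j.appender.sift.appender.layout.ConversionPattern=%d{ABSOLUTE} | %-5.5p | %-16.16t | %-32.32c{1} | %m%n
log4j.appender.sift.appender.file=${karaf.data}/log/$\\{bundle.name\\}.log
log4j.appender.sift.appender.append=true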




--
Cheers,
Guillaume Nodet
------------------------
Blog: http://gnodet.blogspot.com/
------------------------
Open Source SOA
http://fusesource.com

Re: Using custom log4j appenders under Karaf 2.1.4

mikevan
Guillaume,

Is there a way to configure Sift to log the output of a specific bundle?  I have a bundle that creates a number of log messages, but these are being split between a number of log files when using the default configuration of Sift.  What would benefit my team is the ability to see all of the log messages from a specific bundle in one log file.

Along these lines, is there a good example of a fragment that aggregates logging from a number of bundles into a single log file?  I know that can be done, but I'm having difficulty figuring out how.

Re: Using custom log4j appenders under Karaf 2.1.4

Guillaume Nodet
If you have one bundle, can't you just configure the logger for that
bundle with a different appender?  Something like:

log4j.logger.the.root.package.of.your.bundle=INFO, customappender
log4j.appender.customappender=org.apache.log4j.FileAppender
log4j.appender.customappender.file=${karaf.data}/log/yourbundle.log
log4j.appender.customappender.layout=org.apache.log4j.SimpleLayout
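For the second part of your question (aggregating several bundles into a single file), a fragment isn't strictly needed; pointing each bundle's logger at the same appender in etc/org.ops4j.pax.logging.cfg should do it. A rough sketch, with made-up package names:

log4j.logger.com.example.bundle1=INFO, aggregated
log4j.logger.com.example.bundle2=INFO, aggregated
log4j.appender.aggregated=org.apache.log4j.RollingFileAppender
log4j.appender.aggregated.layout=org.apache.log4j.PatternLayout
log4j.appender.aggregated.layout.ConversionPattern=%d{ISO8601} | %-5.5p | %c | %m%n
log4j.appender.aggregated.file=${karaf.data}/log/aggregated.log
log4j.appender.aggregated.maxFileSize=10MB
log4j.appender.aggregated.maxBackupIndex=5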





--
Cheers,
Guillaume Nodet
------------------------
Blog: http://gnodet.blogspot.com/
------------------------
Open Source SOA
http://fusesource.com

Re: Using custom log4j appenders under Karaf 2.1.4

Echo
In reply to this post by mikevan
@mikevan:
Regarding your 2nd question:
 2) is there an example of how to get log messages written into different log files?

Did you manage it?  I'm messing around with the file now, but I couldn't find a solution yet.

Re: Using custom log4j appenders under Karaf 2.1.4

mikevan
Echo,

Yes, but instead of making you google it, I'll just tell you how it's done. :-)

First, it's good to remember that in log4j, logging can be set at the package level.  This can be done in Karaf using the following console command:

karaf@root> log:set <level> mypackage.subpackage

This will set the logging for that specific package to <level>.

In Karaf 3.x, you can also check the log level configured for a specific package using the following command:

karaf@root> log:get mypackage.subpackage

Some caveats with the log console commands.  With the exception of log:set, these commands don't affect the log files on disk. For example, if you type log:clear, you won't have access to any log messages written out prior to executing the log:clear command.  However, log:clear won't remove those older log messages from the log directory. This is because the logging commands don't actually work against your log files.  Instead, they look at the PaxLoggingEvents kept inside Karaf.  So, if you accidentally run log:clear, don't worry, you won't lose all of your log messages.  However, if you accidentally set your logging level to DEBUG or TRACE, all of your loggers will start logging at that level and your logs will fill up very quickly.

So, the log commands are great, but they won't get you what you want, which is a file containing only the messages from a given package.  To do that, you simply create a logger for your package and then attach a RollingFileAppender to that logger.  Here's an example of what to place in your etc/org.ops4j.pax.logging.cfg file.

# mypackage.subpackage appenderlog4j.logger.mypackage.subpackage= <level>, subpackage
log4j.appender.subpackage=org.apache.log4j.RollingFileAppender
log4j.appender.subpackage.layout=org.apache.log4j.PatternLayout
log4j.appender.subpackage.layout.ConversionPattern=<pattern>
log4j.appender.subpackage.file=${karaf.data}/log/subpackage.log
log4j.appender.subpackage.append=true
log4j.appender.subpackage.maxFileSize=10MB
log4j.appender.subpackage.maxBackupIndex=10

Now, some caveats.  If your "subpackage" has Camel routes that you'd like logged into your subpackage log file, they won't appear there. This is because the messages from your Camel routes are generated by org.apache.camel, and not by your subpackage.  I don't really know a way to get those Camel messages for subpackage to appear in your subpackage log file.  Also, all messages are still going to be written into your karaf.log file. So, if you are seeing some strangeness and can't diagnose it in your subpackage.log file, check out your karaf.log.
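(A side note on that last point: log4j has a per-logger additivity flag, so if the duplication into karaf.log ever becomes a problem, a line like the one below should keep subpackage messages out of the root appenders.  I haven't verified this under pax-logging specifically, so treat it as a hint.)

log4j.additivity.mypackage.subpackage=false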

Please let me know if this answers your question.


Re: Using custom log4j appenders under Karaf 2.1.4

mikevan
Oops!  Looks like there was a typo. Try this instead:
# mypackage.subpackage appender
log4j.logger.mypackage.subpackage=<level>, subpackage
log4j.appender.subpackage=org.apache.log4j.RollingFileAppender
log4j.appender.subpackage.layout=org.apache.log4j.PatternLayout
log4j.appender.subpackage.layout.ConversionPattern=<pattern>
log4j.appender.subpackage.file=${karaf.data}/log/subpackage.log
log4j.appender.subpackage.append=true
log4j.appender.subpackage.maxFileSize=10MB
log4j.appender.subpackage.maxBackupIndex=10




Re: Using custom log4j appenders under Karaf 2.1.4

Echo
Actually, I have solved it, but I am still one step away from my goal.

This is my "org.ops4j.pax.logging.cfg" after updating it:


################################################################################
#
#    Licensed to the Apache Software Foundation (ASF) under one or more
#    contributor license agreements.  See the NOTICE file distributed with
#    this work for additional information regarding copyright ownership.
#    The ASF licenses this file to You under the Apache License, Version 2.0
#    (the "License"); you may not use this file except in compliance with
#    the License.  You may obtain a copy of the License at
#
#       http://www.apache.org/licenses/LICENSE-2.0
#
#    Unless required by applicable law or agreed to in writing, software
#    distributed under the License is distributed on an "AS IS" BASIS,
#    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#    See the License for the specific language governing permissions and
#    limitations under the License.
#
################################################################################

# Root logger
log4j.rootLogger=INFO, out,osgi:VmLogAppender
log4j.logger.new=INFO,new
log4j.throwableRenderer=org.apache.log4j.OsgiThrowableRenderer

# CONSOLE appender not used by default
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{ABSOLUTE} | %-5.5p | %-16.16t | %-32.32c{1} | %-32.32C %4L | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n



# New File appender
#log4j.rootLogger=INFO,new
log4j.appender.new=org.apache.log4j.RollingFileAppender
log4j.appender.new.layout=org.apache.log4j.PatternLayout
log4j.appender.new.layout.ConversionPattern=%d{ABSOLUTE} | %-5.5p | %-16.16t | %-32.32c{1} | %-32.32C %4L | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n
log4j.appender.new.file=${karaf.data}/log/new.log
log4j.appender.new.append=true
log4j.appender.new.maxFileSize=1MB
log4j.appender.new.maxBackupIndex=10

# File appender
log4j.appender.out=org.apache.log4j.RollingFileAppender
log4j.appender.out.layout=org.apache.log4j.PatternLayout
log4j.appender.out.layout.ConversionPattern=%d{ABSOLUTE} | %-5.5p | %-16.16t | %-32.32c{1} | %-32.32C %4L | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n
log4j.appender.out.file=${karaf.data}/log/karaf.log
log4j.appender.out.append=true
log4j.appender.out.maxFileSize=1MB
log4j.appender.out.maxBackupIndex=10



# Sift appender
log4j.appender.sift=org.apache.log4j.sift.MDCSiftingAppender
log4j.appender.sift.key=bundle.name
log4j.appender.sift.default=karaf
log4j.appender.sift.appender=org.apache.log4j.FileAppender
log4j.appender.sift.appender.layout=org.apache.log4j.PatternLayout
log4j.appender.sift.appender.layout.ConversionPattern=%d{ABSOLUTE} | %-5.5p | %-16.16t | %-32.32c{1} | %-32.32C %4L | %m%n
log4j.appender.sift.appender.file=${karaf.data}/log/$\\{bundle.name\\}.log
log4j.appender.sift.appender.append=true


Now it logs everything into karaf.log.

I just need to log into "new.log" from a specific processor in a Camel route, like the following:

process(doSmth).log(LoggingLevel.INFO,"new","Hello New Logger")

Only in that case should it go and log into new.log.

In "new.log" it logs successfully, but the entry doesn't have the stack trace like the one already written to karaf.log.

This is what has been logged into "new.log":

20:41:07,352 | INFO  | qtp3901595-74    | new | rg.apache.camel.processor.Logger  213 | 51 - org.apache.camel.camel-core - 2.6.0 | Hello NewLogger

Re: Using custom log4j appenders under Karaf 2.1.4

Echo
The solution to my problem is:

1- You should update "org.ops4j.pax.logging.cfg" and add a new file appender.
In my case the new file appender is called "new"; the relevant additions are shown below, and the rest of the file stays exactly as posted above.


    # register the "new" logger
    log4j.logger.new=INFO,new

    # new File appender
    log4j.appender.new=org.apache.log4j.RollingFileAppender
    log4j.appender.new.layout=org.apache.log4j.PatternLayout
    log4j.appender.new.layout.ConversionPattern=%d{ABSOLUTE} | %-5.5p | %-16.16t | %-32.32c{1} | %-32.32C %4L | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n
    log4j.appender.new.file=${karaf.data}/log/new.log
    log4j.appender.new.append=true
    log4j.appender.new.maxFileSize=1MB
    log4j.appender.new.maxBackupIndex=10


2- I use the Camel DSL to log:

    process(doSmth).log(LoggingLevel.INFO,"new","HelloZ my new Logger ${exception.stacktrace}")
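Spelled out as a full route, that might look roughly like this (the endpoint and the processor are made up; the "new" category is the logger configured in step 1):

import org.apache.camel.Exchange;
import org.apache.camel.LoggingLevel;
import org.apache.camel.Processor;
import org.apache.camel.builder.RouteBuilder;

public class NewLoggerRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // Send any exception's stack trace to the "new" category so it
        // ends up in new.log via the appender configured in step 1.
        onException(Exception.class)
            .log(LoggingLevel.ERROR, "new", "Failed: ${exception.stacktrace}")
            .handled(true);

        from("direct:orders")                  // made-up endpoint
            .process(new Processor() {         // stand-in for doSmth
                public void process(Exchange exchange) throws Exception {
                    // real work goes here
                }
            })
            .log(LoggingLevel.INFO, "new", "Hello new logger: ${body}");
    }
}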



Enjoy :)

Re: Using custom log4j appenders under Karaf 2.1.4

apadki
Hello,

Thanks for the post, it was very useful. I am now stuck: everything shows up in karaf.log but nothing in new.log.
I guess I am missing the DSL code and am not understanding it at all.
Can someone please show me how to log to a specific appender, i.e. "new" in this case?
Regards
- Ana

Re: Using custom log4j appenders under Karaf 2.1.4

Echo
In reply to this post by mikevan
OMG, I got a notification about this post I made months ago and was flabbergasted when I noticed that I didn't even thank you, @mikevan... HOW RUDE OF ME.
Please accept my apology.

Re: Using custom log4j appenders under Karaf 2.1.4

amollin
In reply to this post by mgardiner
Hello there,

>"If you plan to use your own appenders, you need to create an OSGi bundle and attach it as a fragment to >the bundle with a symbolic name of org.ops4j.pax.logging.pax-logging-service. This way, the underlying >logging system will be able to see and use your appenders."

>We have a custom remoting logging appender we wish to utilize with our project hosted in Karaf 2.1.4.

>I am assuming we turn our appenders jar into a bundle with the fragment host set to >org.ops4j.pax.logging.pax-logging-service and deployed with our project.  Is that correct?

May I know the steps to create an OSGi bundle to meet the above needs?  I also have a custom appender to deploy in the Karaf container.

Thanks,
Ashok

Setting Up logentries for OSGi Applications in Apache Karaf

Hendy Irawan
In reply to this post by mgardiner
Logentries is a nice way to track your server/application logs. Free account available. :)

I blogged this how-to at:

http://spring-java-ee.blogspot.com/2012/12/setting-up-logentries-for-osgi.html

Hope this is useful.

Hendy