Could not load jdbc/mssql adapter: adapter class not registered in ADAPTER_MAP
from-nibly opened this issue · 3 comments
Logstash information:
- Logstash version:
8.0.0
- Logstash installation source:
docker
- How is Logstash being run:
docker
- How was the Logstash Plugin installed:
docker
Dockerfile
FROM logstash:8.0.0
RUN logstash-plugin install logstash-output-opensearch
# temp fix https://github.com/elastic/logstash/issues/13777
RUN sed --in-place "s/gem.add_runtime_dependency \"sinatra\", '~> 2'/gem.add_runtime_dependency \"sinatra\", '~> 2.1.0'/g" /usr/share/logstash/logstash-core/logstash-core.gemspec
RUN cd /usr/share/logstash/ && \
rm Gemfile.lock && \
/usr/share/logstash/bin/ruby -S /usr/share/logstash/vendor/bundle/jruby/2.5.0/bin/bundle install
RUN cd /usr/share/logstash/logstash-core/lib/jars && \
curl -L https://go.microsoft.com/fwlink/?linkid=2186164 -o sqljdbc.tar.gz && \
tar -xvf sqljdbc.tar.gz && \
mv ./sqljdbc_*/enu/* ./ && \
rm -rf ./sqljdbc*
- JVM version
openjdk 11.0.13 2021-10-19
- JVM installation source
docker
- Value of the JAVA_HOME environment variable if set:
N/A
- OS version:
Linux nixos-rip 5.10.99 #1-NixOS SMP Tue Feb 8 17:30:41 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
Description of the problem including expected versus actual behavior:
When I try to run a jdbc input plugin with the mssql/sqlserver jdbc driver, it throws the following error:
Could not load jdbc/mssql adapter: adapter class not registered in ADAPTER_MAP
Things I've Tried
- Using multiple versions of the jdbc drivers provided by Microsoft.
- Omitting the jdbc_driver_library.
- Adding/removing the Java:: prefix from the class property.
- Only copying the jar into the /usr/share/logstash/logstash-core/lib/jars folder.
- Running on Logstash v7 with a similar temporary hack for sinatra.
I've looked for documentation on this issue, but keep hitting dead ends on Stack Overflow and elsewhere.
I'm also having trouble finding information on what the ADAPTER_MAP is and how it gets populated.
Questions
- Is the adapter different from the driver?
- Is there a way to debug what is in the ADAPTER_MAP?
- Am I installing the jdbc driver correctly?
- Is there any documentation specifically for mssql/sqlserver and its particulars?
Expectations
I would expect it to work with the configuration provided, assuming I'm not doing something obviously dumb here.
Steps to reproduce:
Dockerfile is above.
pipeline file mounted to /usr/share/logstash/pipeline/
input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/mssql-jdbc-10.2.0.jre11.jar"
    jdbc_driver_class => "Java::com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "${DB_CONNECTION_STRING}"
    jdbc_user => "${DB_USER}"
    jdbc_password => "${DB_PASSWORD}"
    schedule => "* * * * *"
    statement_filepath => "/opt/test.sql"
  }
}
output {
  opensearch {
    id => "***REDACTED***"
    ecs_compatibility => "v1"
    hosts => ["${ES_HOST}"]
    index => "***REDACTED***"
    doc_as_upsert => true
    action => "%{[action]}"
    user => "${ES_USER}"
    password => "${ES_PASSWORD}"
    ssl => true
  }
}
docker command
docker run -ti -v $(pwd)/pipeline/:/usr/share/logstash/pipeline/ -v $(pwd)/test.sql:/opt/test.sql -e DB_CONNECTION_STRING -e DB_USER -e DB_PASSWORD -e DB_HOST -e ES_HOST -e ES_USER -e ES_PASSWORD --net=host ${IMAGE_ID}
Log with error
[2022-02-23T17:09:00,477][ERROR][logstash.pluginmixins.jdbc.scheduler][main][3af9d16d69a0f0d79d853235cbb95ec80f0c515a4d7d685696d377c3b11b8769] Scheduler intercepted an error: {:exception=>Sequel::AdapterNotFound, :message=>"Could not load jdbc/mssql adapter: adapter class not registered in ADAPTER_MAP", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.53.0/lib/sequel/database/connecting.rb:97:in load_adapter'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.53.0/lib/sequel/adapters/jdbc.rb:378:in
adapter_initialize'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.53.0/lib/sequel/database/misc.rb:156:in initialize'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.53.0/lib/sequel/database/connecting.rb:57:in
connect'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/sequel-5.53.0/lib/sequel/core.rb:124:in connect'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/plugin_mixins/jdbc/jdbc.rb:117:in
block in jdbc_connect'", "org/jruby/RubyKernel.java:1442:in loop'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/plugin_mixins/jdbc/jdbc.rb:114:in
jdbc_connect'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/plugin_mixins/jdbc/jdbc.rb:157:in open_jdbc_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/plugin_mixins/jdbc/jdbc.rb:214:in
execute_statement'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/inputs/jdbc.rb:345:in execute_query'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/inputs/jdbc.rb:308:in
block in run'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:234:in do_call'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:258:in
do_trigger'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:300:in block in start_work_thread'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:299:in
block in start_work_thread'", "org/jruby/RubyKernel.java:1442:in loop'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler/jobs.rb:289:in
block in start_work_thread'"], :now=>"2022-02-23T17:09:00.476", :last_time=>"2022-02-23T17:09:00.473", :next_time=>"2022-02-23T17:10:00.000", :job=>#<Rufus::Scheduler::CronJob:0x3f33c406 @last_at=nil, @tags=[], @scheduled_at=2022-02-23 17:06:32 +0000, @cron_line=#<Rufus::Scheduler::CronLine:0x381e0c59 @Timezone=nil, @weekdays=nil, @DayS=nil, @seconds=[0], @minutes=nil, @Hours=nil, @months=nil, @monthdays=nil, @original="* * * * ">, @last_time=2022-02-23 17:09:00 +0000, @times=nil, @Locals={}, @unscheduled_at=nil, @callable=#Proc:0x42862784@/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/inputs/jdbc.rb:307, @next_time=2022-02-23 17:10:00 +0000, @local_mutex=#Thread::Mutex:0x3c105f42, @mean_work_time=0.05812200000000001, @count=3, @last_work_time=0.006668, @Scheduler=#<LogStash::PluginMixins::Jdbc::Scheduler:0x74a4e05d @jobs=#<Rufus::Scheduler::JobArray:0x7dd41e87 @mutex=#Thread::Mutex:0xe9dbcbe, @array=[#<Rufus::Scheduler::CronJob:0x3f33c406 ...>]>, @scheduler_lock=#Rufus::Scheduler::NullLock:0x63bca02c, @started_at=2022-02-23 17:06:32 +0000, @thread=#<Thread:0x7b819e3a@[3af9d16d69a0f0d79d853235cbb95ec80f0c515a4d7d685696d377c3b11b8769]<jdbc__scheduler sleep>, @mutexes={}, @work_queue=#Thread::Queue:0x3ec1aafa, @frequency=1.0, @_work_threads=[#<Thread:0x25a89003@[3af9d16d69a0f0d79d853235cbb95ec80f0c515a4d7d685696d377c3b11b8769]<jdbc__scheduler_worker-00 run>], @Paused=false, @trigger_lock=#Rufus::Scheduler::NullLock:0x2393e95a, @opts={:max_work_threads=>1, :thread_name=>"[3af9d16d69a0f0d79d853235cbb95ec80f0c515a4d7d685696d377c3b11b8769]<jdbc__scheduler", :frequency=>1.0}, @thread_key="rufus_scheduler_2054", @max_work_threads=1, @stderr=#<IO:>>, @paused_at=nil, @first_at=2022-02-23 17:06:32 +0000, @opts={}, @id="cron_1645635992.289568_4810128455882558930", @handler=#Proc:0x42862784@/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-jdbc-5.2.3/lib/logstash/inputs/jdbc.rb:307, @original=" * * * *">, 
:opts=>{:max_work_threads=>1, :thread_name=>"[3af9d16d69a0f0d79d853235cbb95ec80f0c515a4d7d685696d377c3b11b8769]<jdbc__scheduler", :frequency=>1.0}, :started_at=>2022-02-23 17:06:32 +0000, :thread=>"#<Thread:0x7b819e3a@[3af9d16d69a0f0d79d853235cbb95ec80f0c515a4d7d685696d377c3b11b8769]<jdbc__scheduler sleep>", :jobs_size=>1, :work_threads_size=>1, :work_queue_size=>0}
Hey, there's nothing special about MSSQL and it's usually expected to work.
One thing also worth trying is to double-check the permissions of the added .jar files in /usr/share/logstash/logstash-core/lib/jars. Are you sure the copying was done as the logstash user?
Could you provide us the output of ls -l /usr/share/logstash/logstash-core/lib/jars?
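To script the same check as ls -l from above, here is a minimal Ruby sketch (plain Ruby, no Logstash dependencies) that reports each .jar's owner and whether the current user can read it. The jar_report helper is hypothetical, written for this illustration; JARS_DIR is the directory discussed in this thread.

```ruby
require 'etc'

# Directory where Logstash picks up jars (from this thread).
JARS_DIR = '/usr/share/logstash/logstash-core/lib/jars'

# For each .jar in dir, report its owner and readability for the
# current user. Returns an array of hashes; empty if dir is missing.
def jar_report(dir)
  Dir.glob(File.join(dir, '*.jar')).map do |jar|
    stat = File.stat(jar)
    {
      file: File.basename(jar),
      owner: Etc.getpwuid(stat.uid).name,
      readable: File.readable?(jar)
    }
  end
end

# Inside the container, every jar should come back readable,
# ideally owned by the logstash user:
jar_report(JARS_DIR).each { |r| puts r }
```

Run this as the same user Logstash runs as; any entry with readable: false is a candidate cause for the driver class failing to load.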
Is the adapter different from the driver?
The driver is the Java library that implements the JDBC standard.
Sequel, the underlying library, uses the adapter terminology: different databases have different adapters that the library uses to adapt behavior (and provide database-specific behavior), e.g. a database-specific way to set limit/offset for a query.
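To illustrate the driver/adapter split with a toy example (this is not Sequel's actual code, just a sketch of the idea): the driver speaks JDBC to the database, while the adapter knows each database's SQL dialect, such as how pagination is spelled.

```ruby
# Hypothetical per-adapter SQL generation for limit/offset,
# illustrating why Sequel needs a database-specific adapter.
ADAPTER_SQL = {
  mssql:    ->(n, off) { "SELECT * FROM t ORDER BY id OFFSET #{off} ROWS FETCH NEXT #{n} ROWS ONLY" },
  postgres: ->(n, off) { "SELECT * FROM t LIMIT #{n} OFFSET #{off}" }
}

def paginate(adapter, limit, offset)
  ADAPTER_SQL.fetch(adapter).call(limit, offset)
end

puts paginate(:mssql, 10, 20)
puts paginate(:postgres, 10, 20)
```

The same logical query comes out as different SQL per adapter, which is exactly the behavior the adapter layer exists to encapsulate.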
Is there a way to debug what is in the ADAPTER_MAP?
The Sequel library attempting to load the jdbc/mssql adapter should only happen as a fallback, when the driver class is not available.
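A hedged sketch of that fallback behavior (a plain-Ruby simulation, not Sequel's real implementation): the jdbc adapter first tries the configured driver class; only when that fails does it look up the connection-string subprotocol (e.g. "mssql") in an ADAPTER_MAP of registered adapters, raising when neither works. The ADAPTER_MAP contents here are invented for illustration.

```ruby
# Simulated registry; mssql deliberately absent to reproduce the error.
ADAPTER_MAP = { sqlite: :loaded_sqlite_adapter }

class AdapterNotFound < StandardError; end

def connect(driver_class_loads:, subprotocol:)
  # Happy path: the driver class loaded, so the map is never consulted.
  return :driver_class_used if driver_class_loads
  # Fallback: look the subprotocol up in the registry.
  ADAPTER_MAP.fetch(subprotocol) do
    raise AdapterNotFound,
          "Could not load jdbc/#{subprotocol} adapter: adapter class not registered in ADAPTER_MAP"
  end
end

puts connect(driver_class_loads: true, subprotocol: :mssql)
begin
  connect(driver_class_loads: false, subprotocol: :mssql)
rescue AdapterNotFound => e
  puts e.message  # the error message from this issue
end
```

Under this model, seeing the ADAPTER_MAP error at all means the driver class itself never loaded, which points back at the jar path, permissions, or class name rather than at Sequel.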
Am I installing the jdbc driver correctly?
If LS has the proper permission to read the .jar in /usr/share/logstash/logstash-core/lib/jars, then yes.
Is there any documentation for specifically mssql/sqlserver and it's particulars?
Nothing specific; we know some users run the plugin with SQL Server and are doing fine.
Hello @from-nibly
I am not sure if the issue got resolved or not, but here is my take.
I configured mssql-server on CentOS 7 and populated a database with a sample table and rows/columns.
With the LS jdbc input configuration below, I am able to see my events.
If you look at the sample code provided by Microsoft in the archive, it mentions the driver string.
It should be "com.microsoft.sqlserver.jdbc.SQLServerDataSource".
input {
  jdbc {
    jdbc_driver_library => "/home/docker/sqljdbc_9.4/enu/mssql-jdbc-9.4.0.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDataSource"
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=TestDB"
    jdbc_user => "sa"
    jdbc_password => "Mysql@2022"
    schedule => "*/1 * * * *"
    statement => "SELECT * from dbo.Inventory where quantity > 152"
  }
}
The event looks like the one below, where name/quantity are my table data:
{
    "quantity" => 154,
  "@timestamp" => 2022-07-14T10:40:02.955Z,
        "name" => "orange",
    "@version" => "1",
          "id" => 2
}
Best Regards.
Apologies for the late response; we ended up abandoning this shortly after running into these issues. I won't have time to try the solutions and see if they work.