SAP data source for Spark

Warning

This repository is not maintained at the moment and the CI action is disabled. Active maintenance and CI only make sense with access to an SAP system, and there is no such access right now.

Supported features

This was a proof-of-concept project; here is what it can do:

  • Retrieve schemas for a list of tables
  • Read a table using RFC_READ_TABLE (within its limitations)
    • Allows configuring an alternative read table function
    • Some filter expressions from the WHERE clause can be pushed down
  • Call a BAPI and return selected export parameters or tables

For usage examples see src/test/scala/SapSparkDatasourceIntegrationSpec.scala.
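
The snippet below is a minimal usage sketch, not the project's confirmed API: the data source format name and the option keys ("ahost", "user", "passwd", "table") are assumptions, and the BAPI options are only hinted at in a comment; the integration spec above is the authoritative reference.

```scala
// Hypothetical usage sketch: the format name and option keys are assumptions,
// not the data source's documented API. See the integration spec for the
// real option names.
import org.apache.spark.sql.SparkSession

object SapReadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sap-datasource-sketch")
      .master("local[*]")
      .getOrCreate()

    // Read a single table through RFC_READ_TABLE (subject to its limitations).
    val users = spark.read
      .format("sap")                          // assumed short name
      .option("ahost", sys.env("SAP_AHOST"))  // assumed option keys
      .option("user", sys.env("SAP_USER"))
      .option("passwd", sys.env("SAP_PASSWORD"))
      .option("table", "USR01")               // any table the connecting user may read
      .load()

    // Simple predicates may be pushed down into RFC_READ_TABLE's OPTIONS.
    users.where("BNAME = 'MYUSER'").show()

    // A BAPI call would be configured analogously (an option naming the BAPI
    // and the export parameter or table to return); the exact keys are
    // defined in the integration spec.

    spark.stop()
  }
}
```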

Known issues

Getting the SAP JCo driver

  • Get the driver from your SAP vendor or download it from SAP
  • Put the driver's .jar, .so and .jnilib files into the lib folder
  • Rename the files as explained here
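
As a sketch of one way to wire the driver into the build (an assumption, not the project's documented setup): sbt picks up jars in the lib folder as unmanaged dependencies, but the native .so/.jnilib also has to be visible via java.library.path when the tests run in a forked JVM.

```scala
// build.sbt fragment (sketch only, not the project's confirmed configuration):
// jars in lib/ are unmanaged dependencies by default; the native JCo library
// additionally needs to be on java.library.path, so fork the test JVM and
// point it at lib/.
Test / fork := true
Test / javaOptions += s"-Djava.library.path=${baseDirectory.value / "lib"}"
```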

Running tests

  • Get the SAP JCo JNI libraries (see above)
  • Set SAP credentials via the environment variables:
    export SAP_AHOST="/H/saprouter.example.com/S/0000/H/abcd/S/1111" SAP_USER=MYUSER SAP_PASSWORD=VERYSECRET
  • Run sbt test
  • Some tests relied on the contents of the test SAP system we had access to.
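
Because the integration tests need a live SAP system, one way to keep sbt test usable without credentials is to skip those tests when the environment variables are missing. A minimal sketch, assuming ScalaTest (the suite name below is hypothetical):

```scala
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical guard: cancel integration tests when SAP credentials are absent.
class SapCredentialsSpec extends AnyFunSuite {
  private val creds = for {
    ahost <- sys.env.get("SAP_AHOST")
    user  <- sys.env.get("SAP_USER")
    pass  <- sys.env.get("SAP_PASSWORD")
  } yield (ahost, user, pass)

  test("connects to the configured SAP system") {
    assume(creds.isDefined, "set SAP_AHOST, SAP_USER and SAP_PASSWORD to run this test")
    // ... build the connection from creds and run the integration checks here
  }
}
```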
