audienceproject/spark-dynamodb
Plug-and-play implementation of an Apache Spark custom data source for AWS DynamoDB.
Scala · Apache-2.0
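The data source is used through the connector's implicits object (the same `com.audienceproject.spark.dynamodb.implicits$` referenced in issue #25 below). A minimal read/write sketch, assuming an already configured `SparkSession` named `spark` and placeholder table names:

```scala
import com.audienceproject.spark.dynamodb.implicits._

// Read a DynamoDB table into a DataFrame.
// "SomeTable" is a placeholder; the table must exist in the
// AWS account/region the Spark application is configured for.
val dynamoDf = spark.read.dynamodb("SomeTable")

dynamoDf.show()

// Write the DataFrame back out to another (placeholder) table.
dynamoDf.write.dynamodb("SomeOtherTable")
```

Running this requires AWS credentials and throughput configuration on the cluster; several of the issues below (rate limiting, capacity, region selection) concern exactly that setup.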
Issues
Issues inferring schema
#18 opened by lepirlouit · 2 comments
java.lang.NoClassDefFoundError: com/audienceproject/spark/dynamodb/implicits$
#25 opened by mjfroehlich · 3 comments
Write binary column types as output
#41 opened by sb2nov · 2 comments
targetCapacity when executors != readPartitions
#24 opened by protometa · 11 comments
Stop using Guava
#35 opened by AceHack · 7 comments
Issues with Guava on EMR?
#29 opened by anuras · 11 comments
On-demand capacity
#32 opened by liorwinner · 3 comments
DynamoDB connector configuration
#38 opened by venkatwilliams · 3 comments
java.lang.NoClassDefFoundError: org/spark_project/guava/util/concurrent/RateLimiter
#4 opened by yeungp · 0 comments
Capacity calculations are made eagerly
#37 opened by cosmincatalin · 1 comment
Add git tags
#23 opened by aelesbao · 1 comment
Can't change the region
#27 opened by anuras · 4 comments
Performance issue writing in DynamoDB
#30 opened by lucienfregosi · 4 comments
V0.4.0 tanks write performance
#22 opened by colemanja91 · 5 comments
Write nested objects in DynamoDB
#20 opened by lucienfregosi · 10 comments
java.lang.IllegalArgumentException: Spark DataType 'decimal(19,0)' could not be mapped to a corresponding DynamoDB data type.
#14 opened by lepirlouit · 0 comments
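Until the type mapping covers `DecimalType`, a common workaround for the error above is to cast the decimal column to a type the connector does understand before writing. A minimal sketch, assuming a DataFrame `df` with a hypothetical `decimal(19,0)` column named `id`:

```scala
import org.apache.spark.sql.functions.col

// Cast the (hypothetical) decimal(19,0) column `id` to LongType,
// which maps to a DynamoDB number, before writing the DataFrame.
val writableDf = df.withColumn("id", col("id").cast("long"))
```

Casting to `string` is an alternative when the values may exceed the range of a 64-bit long.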
Adding support for binary columns
#16 opened by Neuw84 · 1 comment
No data retrieved from table
#6 opened by bmgraff