ArangoDB Spark Connector - Scala Reference
This library has been deprecated in favor of the new ArangoDB Datasource for Apache Spark.
ArangoSpark.save
ArangoSpark.save[T](rdd: RDD[T], collection: String, options: WriteOptions)
ArangoSpark.save[T](dataset: Dataset[T], collection: String, options: WriteOptions)
Saves data from an RDD or Dataset into ArangoDB.
Arguments
- rdd / dataset: RDD[T] or Dataset[T] - The RDD or Dataset containing the data to save
- collection: String - The collection to save into
- options: WriteOptions
  - database: String - Database to write into
  - hosts: String - Alternative hosts to the context property arangodb.hosts
  - user: String - Alternative user to the context property arangodb.user
  - password: String - Alternative password to the context property arangodb.password
  - useSsl: Boolean - Alternative useSsl to the context property arangodb.useSsl
  - sslKeyStoreFile: String - Alternative sslKeyStoreFile to the context property arangodb.ssl.keyStoreFile
  - sslPassPhrase: String - Alternative sslPassPhrase to the context property arangodb.ssl.passPhrase
  - sslProtocol: String - Alternative sslProtocol to the context property arangodb.ssl.protocol
  - method: WriteOptions.Method - The write method to use; one of WriteOptions.INSERT, WriteOptions.UPDATE, WriteOptions.REPLACE
Examples
val sc: SparkContext = ...
val documents = sc.parallelize((1 to 100).map { i => MyBean(i) })
ArangoSpark.save(documents, "myCollection", WriteOptions("myDB"))
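The base connection is configured through the arangodb.* context properties on the SparkConf and can be overridden per call via WriteOptions. The following sketch illustrates this; the endpoint and credentials are placeholders, imports assume the connector's com.arangodb.spark package, and the method(...) builder setter is assumed to exist alongside the database(...) setter shown in the streaming example below.

import org.apache.spark.{SparkConf, SparkContext}
import com.arangodb.spark.{ArangoSpark, WriteOptions}

case class MyBean(i: Int)

// Base connection settings via the arangodb.* context properties listed above
// (host and credentials are placeholders).
val conf = new SparkConf()
  .setAppName("arango-save-example")
  .setMaster("local[*]")
  .set("arangodb.hosts", "127.0.0.1:8529")
  .set("arangodb.user", "root")
  .set("arangodb.password", "")
val sc = new SparkContext(conf)

val documents = sc.parallelize((1 to 100).map(i => MyBean(i)))

// Per-call overrides via WriteOptions; the method(...) setter is assumed here,
// WriteOptions.REPLACE is one of the documented write methods.
ArangoSpark.save(documents, "myCollection",
  new WriteOptions().database("myDB").method(WriteOptions.REPLACE))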
ArangoSpark.saveDF
ArangoSpark.saveDF(dataframe: DataFrame, collection: String, options: WriteOptions)
Saves data from a DataFrame into ArangoDB.
Arguments
- dataframe: DataFrame - The DataFrame containing the data to save
- collection: String - The collection to save into
- options: WriteOptions
  - database: String - Database to write into
  - hosts: String - Alternative hosts to the context property arangodb.hosts
  - user: String - Alternative user to the context property arangodb.user
  - password: String - Alternative password to the context property arangodb.password
  - useSsl: Boolean - Alternative useSsl to the context property arangodb.useSsl
  - sslKeyStoreFile: String - Alternative sslKeyStoreFile to the context property arangodb.ssl.keyStoreFile
  - sslPassPhrase: String - Alternative sslPassPhrase to the context property arangodb.ssl.passPhrase
  - sslProtocol: String - Alternative sslProtocol to the context property arangodb.ssl.protocol
  - method: WriteOptions.Method - The write method to use; one of WriteOptions.INSERT, WriteOptions.UPDATE, WriteOptions.REPLACE
Examples
val sc: SparkContext = ...
val documents = sc.parallelize((1 to 100).map { i => MyBean(i) })
val sql: SQLContext = SQLContext.getOrCreate(sc)
val df = sql.createDataFrame(documents, classOf[MyBean])
ArangoSpark.saveDF(df, "myCollection", WriteOptions("myDB"))
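On newer Spark versions where SparkSession supersedes SQLContext, the same call accepts any DataFrame. A minimal sketch, assuming MyBean is a case class and using Spark's implicit encoders to build the DataFrame:

import org.apache.spark.sql.SparkSession
import com.arangodb.spark.{ArangoSpark, WriteOptions}

case class MyBean(i: Int)

// The SparkSession carries the same arangodb.* context properties through its SparkConf.
val spark = SparkSession.builder()
  .appName("arango-savedf-example")
  .getOrCreate()
import spark.implicits._

val df = (1 to 100).map(i => MyBean(i)).toDF()
ArangoSpark.saveDF(df, "myCollection", WriteOptions("myDB"))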
ArangoSpark.load
ArangoSpark.load[T: ClassTag](sparkContext: SparkContext, collection: String, options: ReadOptions): ArangoRDD[T]
Loads data from ArangoDB into an RDD.
Arguments
- sparkContext: SparkContext - The SparkContext holding the ArangoDB configuration
- collection: String - The collection to load data from
- options: ReadOptions
  - database: String - Database to read from
  - hosts: String - Alternative hosts to the context property arangodb.hosts
  - user: String - Alternative user to the context property arangodb.user
  - password: String - Alternative password to the context property arangodb.password
  - useSsl: Boolean - Alternative useSsl to the context property arangodb.useSsl
  - sslKeyStoreFile: String - Alternative sslKeyStoreFile to the context property arangodb.ssl.keyStoreFile
  - sslPassPhrase: String - Alternative sslPassPhrase to the context property arangodb.ssl.passPhrase
  - sslProtocol: String - Alternative sslProtocol to the context property arangodb.ssl.protocol
Examples
val sc: SparkContext = ...
val rdd = ArangoSpark.load[MyBean](sc, "myCollection", ReadOptions("myDB"))
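The returned ArangoRDD[T] is a regular Spark RDD, so standard transformations and actions apply directly to the loaded documents:

val rdd = ArangoSpark.load[MyBean](sc, "myCollection", ReadOptions("myDB"))

// ArangoRDD[MyBean] behaves like any other RDD.
val total = rdd.count()
val first10 = rdd.take(10)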
ArangoRDD.filter
ArangoRDD.filter(condition: String): ArangoRDD[T]
Adds a filter condition. If used multiple times, the conditions will be combined with a logical AND.
Arguments
- condition: String - The condition for the filter statement. Use doc inside the condition to reference the document, e.g. "doc.name == 'John'"
Examples
val sc: SparkContext = ...
val rdd = ArangoSpark.load[MyBean](sc, "myCollection").filter("doc.name == 'John'")
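Because repeated calls are combined with a logical AND, chaining filters narrows the result to documents matching every condition; doc.age below is a hypothetical attribute used only for illustration:

val sc: SparkContext = ...
// Only documents satisfying both conditions are returned (logical AND).
val rdd = ArangoSpark.load[MyBean](sc, "myCollection", ReadOptions("myDB"))
  .filter("doc.name == 'John'")
  .filter("doc.age >= 18")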
Spark Streaming Integration
RDDs can also be saved to ArangoDB from Spark Streaming using ArangoSpark.save().
Example
dStream.foreachRDD(rdd =>
ArangoSpark.save(rdd, COLLECTION, new WriteOptions().database(DB)))
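A fuller sketch of the streaming path, assuming a hypothetical LogLine case class and a socket text stream as the source; endpoint, credentials, and batch interval are placeholders:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import com.arangodb.spark.{ArangoSpark, WriteOptions}

// Hypothetical document type for the incoming lines.
case class LogLine(value: String)

val conf = new SparkConf()
  .setAppName("arango-streaming-example")
  .setMaster("local[2]")
  .set("arangodb.hosts", "127.0.0.1:8529")  // placeholder endpoint
  .set("arangodb.user", "root")             // placeholder credentials
  .set("arangodb.password", "")
val ssc = new StreamingContext(conf, Seconds(10))

// Hypothetical source: text lines arriving on a TCP socket.
val dStream = ssc.socketTextStream("localhost", 9999)

// Each micro-batch RDD is written to ArangoDB with ArangoSpark.save.
dStream
  .map(line => LogLine(line))
  .foreachRDD(rdd => ArangoSpark.save(rdd, "logs", new WriteOptions().database("myDB")))

ssc.start()
ssc.awaitTermination()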