Write to a Collection
The Macrometa Collections Databricks Connector integrates Apache Spark with Macrometa collections, letting you write data to a Macrometa collection directly from a Spark DataFrame.
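The steps that follow assume an active SparkSession named `spark` and a DataFrame named `modifiedDF` that holds the rows you want to write. A minimal sketch of that setup, assuming illustrative column names and sample values that are not part of the connector API, is shown below.

```scala
import org.apache.spark.sql.SparkSession

// Obtain (or reuse) a SparkSession; on Databricks, `spark` is usually provided for you.
val spark = SparkSession.builder()
  .appName("macrometa-collection-write")
  .getOrCreate()

import spark.implicits._

// Illustrative DataFrame standing in for `modifiedDF`; replace with your own data.
// Presumably the column referenced by the `primaryKey` target option should exist here.
val modifiedDF = Seq(
  ("user1", "Alice", 29),
  ("user2", "Bob", 34)
).toDF("id", "name", "age")
```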
- Set up your target options:

  ```scala
  val targetOptions = Map(
    "regionUrl" -> "<REGION_URL>",
    "apiKey" -> "apikey <API_KEY>",
    "fabric" -> "<FABRIC>",
    "collection" -> "<COLLECTION>",
    "batchSize" -> "<BATCH_SIZE>",
    "primaryKey" -> "<PRIMARY_KEY>"
  )
  ```
- Write to the Macrometa collection:

  ```scala
  import org.apache.spark.sql.SaveMode

  modifiedDF
    .write
    .format("com.macrometa.spark.collection.MacrometaTableProvider")
    .options(targetOptions)
    .mode(SaveMode.Append)
    .save()
  ```
- Close the SparkSession:

  ```scala
  spark.close()
  ```
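The steps above close the session unconditionally. If you want cleanup to happen even when the write fails, one option is to wrap the write in a try/finally block. This is a sketch using the same `spark`, `modifiedDF`, and `targetOptions` as above, not part of the connector API itself.

```scala
import org.apache.spark.sql.SaveMode

try {
  // Append the DataFrame to the target Macrometa collection.
  modifiedDF
    .write
    .format("com.macrometa.spark.collection.MacrometaTableProvider")
    .options(targetOptions)
    .mode(SaveMode.Append)
    .save()
} finally {
  // Close the session whether or not the write succeeded.
  spark.close()
}
```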