For those who prefer to use code, create a notebook (example below) and send the output via the Prophecy Support Portal.
Replace the workspace URL, personal access token, and cluster ID placeholders as appropriate.
```python
# Databricks notebook source
import requests

# Get the Databricks runtime of the cluster
# Get the notebook context using dbutils
context = dbutils.notebook.entry_point.getDbutils().notebook().getContext()

# Retrieve the Databricks runtime version from the context tags
runtime_version = context.tags().get("sparkVersion").get()

# Print the runtime version
print(f"Databricks Runtime Version: {runtime_version}")

# Get the Spark version
spark_version = spark.version
print(f"Spark Version: {spark_version}")

# Get the installed libraries and access mode details of the cluster
# Replace with your Databricks workspace URL, token, and cluster ID
workspace_url = "replace_with_workspace_url"
token = "replace_with_token"
cluster_id = "replace_with_cluster_id"

# API endpoint to get info about installed libraries
url = f"{workspace_url}/api/2.0/libraries/cluster-status"

# Make the API request
response = requests.get(url, headers={"Authorization": f"Bearer {token}"}, params={"cluster_id": cluster_id})
library_info = response.json()
print("Libraries:")
for i in library_info['library_statuses']:
    print(i)

# API endpoint to get access mode details
url = f"{workspace_url}/api/2.1/clusters/get"

# Make the API request
response = requests.get(url, headers={"Authorization": f"Bearer {token}"}, params={"cluster_id": cluster_id})
cluster_access_info = response.json()
print(f"Cluster Access Mode: {cluster_access_info['data_security_mode']}")
```
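When run, the notebook prints the Databricks Runtime version, the Spark version, the status of each library installed on the cluster, and the cluster's access mode. Include all of this output in your Support Portal request.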
Next, run the following command in a notebook cell to verify connectivity to your Prophecy environment, replacing `customer_prophecy_url` with your Prophecy URL:

```scala
%scala
import sys.process._

// Replace customer_prophecy_url with your Prophecy environment URL
val command = """curl -X GET "https://customer_prophecy_url/execution""""
Seq("/bin/bash", "-c", command).!!
```
This command tests the reverse websocket protocol required by Prophecy to execute pipelines on Spark clusters. Please include the output from this command in your Support Portal request.
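If you prefer to run the same connectivity check in Python, a minimal sketch is below. It issues the same GET request as the curl command above; the URL is the same placeholder and should be replaced with your Prophecy environment URL.

```python
import requests

# Same placeholder as the curl command above: replace with your Prophecy environment URL
prophecy_url = "https://customer_prophecy_url/execution"

# Issue the same GET request that the curl command performs and print the response
response = requests.get(prophecy_url)
print(f"Status code: {response.status_code}")
print(response.text)
```

We look forward to hearing from you!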