I am trying to create a cluster on Google Cloud with the following parameters: 1 master and 7 worker nodes, each with 1 vCPU. The master node should get the full SSD capacity and the worker nodes should get equal shares of the standard disk capacity. This is my code: … This is my error: … Updated attempt: … I don't follow what I…
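The question's actual code and error are not included in the excerpt, but if the cluster being created is a Dataproc cluster, a minimal sketch with the google-cloud-dataproc Python client might look like the following. The project ID, region, cluster name, machine type (n1-standard-1 for 1 vCPU), and disk sizes are illustrative assumptions, not values taken from the question.

```python
# Sketch: a 1-master / 7-worker Dataproc cluster where the master uses an SSD
# boot disk and each worker gets an equal share of standard persistent disk.
# Project, region, cluster name, machine type and disk sizes are placeholders.
from google.cloud import dataproc_v1

project_id = "my-project"   # placeholder
region = "us-central1"      # placeholder

client = dataproc_v1.ClusterControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

cluster = {
    "project_id": project_id,
    "cluster_name": "my-cluster",
    "config": {
        "master_config": {
            "num_instances": 1,
            "machine_type_uri": "n1-standard-1",  # 1 vCPU
            "disk_config": {
                "boot_disk_type": "pd-ssd",       # master gets the SSD capacity
                "boot_disk_size_gb": 100,
            },
        },
        "worker_config": {
            "num_instances": 7,
            "machine_type_uri": "n1-standard-1",  # 1 vCPU each
            "disk_config": {
                "boot_disk_type": "pd-standard",  # standard disk, split evenly
                "boot_disk_size_gb": 50,          # equal share per worker
            },
        },
    },
}

operation = client.create_cluster(
    request={"project_id": project_id, "region": region, "cluster": cluster}
)
operation.result()  # block until the cluster is ready
```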
snakemake – accessing config variables from cluster submission wrapper
I am using a cluster submission wrapper script with snakemake --cluster "python qsub_script.py". I need to pass a global variable, taken from the config['someVar']. This should be applied to all rules. I could add it to the params of each rule, and then access it using job_properties['params']['someVar'], but this is probably not the best solution. Is there a way to…
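One pattern that avoids copying the value into every rule's params is to have the wrapper load the workflow config file directly, in addition to reading the per-job properties with snakemake.utils.read_job_properties. The sketch below assumes the config lives in a config.yaml next to the Snakefile; the key someVar comes from the question, while the file name, qsub flags, and environment-variable name are illustrative.

```python
#!/usr/bin/env python3
# qsub_script.py -- sketch of a cluster submission wrapper that reads both
# the per-job properties and a global value from the workflow config.
import sys
from subprocess import run

import yaml
from snakemake.utils import read_job_properties

jobscript = sys.argv[-1]                      # snakemake passes the jobscript path last
job_properties = read_job_properties(jobscript)

# Global value from the workflow config, without adding it to every rule's params.
with open("config.yaml") as fh:               # assumed config file name
    config = yaml.safe_load(fh)
some_var = config["someVar"]

rule = job_properties.get("rule", "job")
threads = job_properties.get("threads", 1)

# Illustrative qsub call; the resource flags depend on the scheduler setup.
run(["qsub", "-N", rule, "-pe", "smp", str(threads),
     "-v", f"SOME_VAR={some_var}", jobscript], check=True)
```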
install python packages using init scripts in a databricks cluster
I have installed the databricks-cli tool by running pip install databricks-cli, using the appropriate version of pip for my Python installation (pip3 on Python 3). Then, after creating a PAT (personal access token) in Databricks, I run the following bash script, python_dependencies.sh: … I use the above script to install Python libraries…
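The contents of python_dependencies.sh are not shown in the excerpt, but the general shape of this approach is: put a cluster-scoped init script on DBFS, then reference it in the cluster configuration so the listed packages are installed on every node at startup. Below is a minimal sketch using the Databricks REST API via requests; the workspace URL, token, DBFS path, package names, and cluster settings are illustrative placeholders.

```python
# Sketch: upload an init script to DBFS and create a cluster that runs it,
# so the listed Python packages are installed on every node at startup.
import base64
import requests

HOST = "https://<databricks-instance>"   # placeholder workspace URL
TOKEN = "<personal-access-token>"        # the PAT mentioned above
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

init_script = """#!/bin/bash
# Runs on every node when the cluster starts; package list is illustrative.
/databricks/python/bin/pip install pandas==2.1.4 requests
"""

# 1. Put the script on DBFS (contents must be base64-encoded).
requests.post(
    f"{HOST}/api/2.0/dbfs/put",
    headers=HEADERS,
    json={
        "path": "dbfs:/databricks/init/python_dependencies.sh",
        "contents": base64.b64encode(init_script.encode()).decode(),
        "overwrite": True,
    },
).raise_for_status()

# 2. Create a cluster that runs the init script on startup.
requests.post(
    f"{HOST}/api/2.0/clusters/create",
    headers=HEADERS,
    json={
        "cluster_name": "init-script-demo",
        "spark_version": "13.3.x-scala2.12",
        "node_type_id": "i3.xlarge",
        "num_workers": 1,
        "init_scripts": [
            {"dbfs": {"destination": "dbfs:/databricks/init/python_dependencies.sh"}}
        ],
    },
).raise_for_status()
```

Newer workspaces prefer init scripts stored as workspace files or Unity Catalog volumes rather than on DBFS, but the overall flow is the same.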