This guide explains the most important Cloud Workstations behaviors before installing tools.
It covers what persists across restarts (the home directory, ~/) and what does not, plus the most common issues (503 errors, auth, ADC). If you prefer terminal-only access, use the Google Cloud CLI.
General flow:
# 1) ensure gcloud is authenticated
gcloud auth login
# 2) start workstation (if stopped)
gcloud workstations start WORKSTATION_ID --region=REGION --cluster=CLUSTER_ID --config=CONFIG_ID
# 3) open SSH session
gcloud workstations ssh WORKSTATION_ID --region=REGION --cluster=CLUSTER_ID --config=CONFIG_ID
Use the official connect guide above for exact command variants in your environment.
After launching the workstation, open the forwarded port URL from the Workstations UI.
- 8888: JupyterLab
- 8787: RStudio Server
- 8080: Label Studio / annotation tools
- 8443: code-server
If you get a 503, wait 30-120 seconds and check service status/logs.
In the workstation menu, you may see options like these:
- Forward a port (e.g., 8888, 8787, 8080, 8443). Use this when your app is already running.
Rule of thumb:
Cloud Workstations web access is protected by IAM. Most local services behind forwarded ports do not need separate external auth setup.
Only your home directory (~/) persists reliably across restarts. Install scripts in this repo store data/config in persistent paths.
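As a sketch, a persistent layout might look like this (the directory names are illustrative, not fixed by the install scripts):

```shell
# keep data and tools under the home directory so they survive restarts;
# paths outside ~/ (e.g. /tmp, /usr/local) may be reset when the workstation stops
mkdir -p ~/data ~/tools/bin

# make ~/tools/bin available on PATH in future shells (append only once)
grep -q 'tools/bin' ~/.bashrc || echo 'export PATH="$HOME/tools/bin:$PATH"' >> ~/.bashrc
```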
Services should bind to 0.0.0.0 and then be accessed through Workstation forwarded URLs.
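You can verify the pattern with Python's built-in web server before wiring up a real service (port 8080 here is just an example):

```shell
# serve the current directory on all interfaces (backgrounded so the shell
# stays usable); reachable via the workstation's forwarded URL for port 8080
python3 -m http.server 8080 --bind 0.0.0.0 &
SERVER_PID=$!
```

A service bound only to 127.0.0.1 may still work through the forwarder in some setups, but 0.0.0.0 is the safe default.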
Common ports in this repo:
- 8080: Label Studio / annotation tools
- 8443: code-server
- 8787: RStudio Server
- 8888: JupyterLab
Some services autostart at boot; others are started on demand. If a port shows 503 immediately after restart, wait 30-120 seconds and check service status/logs.
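The wait-and-check step can be scripted. This sketch polls a local port every 5 seconds until it answers with any HTTP status (the helper name and defaults are my own, not from the repo):

```shell
# wait_for_port PORT [TRIES]: poll http://localhost:PORT until it returns an
# HTTP status; prints the status code, or fails after TRIES attempts
# (default 24 tries x 5 s = up to ~120 s, matching the advice above)
wait_for_port() {
  local port=$1 tries=${2:-24} code
  for _ in $(seq 1 "$tries"); do
    # curl prints 000 when the connection is refused; || true keeps the loop alive
    code=$(curl -s -o /dev/null -w "%{http_code}" "http://localhost:${port}/" || true)
    if [ "$code" != "000" ]; then
      echo "$code"
      return 0
    fi
    sleep 5
  done
  return 1
}
# usage: wait_for_port 8443
```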
Cloud Storage access (gcloud storage): if auth is missing or expired, run:
# user auth for gcloud CLI
gcloud auth login --no-launch-browser
# application default credentials (used by SDKs/tools)
gcloud auth application-default login --no-launch-browser
# verify auth and project context
gcloud auth list
gcloud config get-value project
# test GCS access
gcloud storage ls gs://YOUR-BUCKET/ | head
# list bucket contents
gcloud storage ls gs://YOUR-BUCKET/
# copy file to bucket
gcloud storage cp local_file.csv gs://YOUR-BUCKET/path/
# copy folder recursively
gcloud storage cp --recursive ~/data gs://YOUR-BUCKET/data/
# sync local folder -> bucket folder
gcloud storage rsync ~/data gs://YOUR-BUCKET/data --recursive
gsutil equivalents (still widely used):
# list bucket
gsutil ls gs://YOUR-BUCKET/
# parallel copy (faster for many files)
gsutil -m cp -r ~/data gs://YOUR-BUCKET/
# sync local folder -> bucket folder
gsutil -m rsync -r ~/data gs://YOUR-BUCKET/data
If tools need Google Cloud APIs (gs://, storage SDKs, etc.), configure ADC:
curl -sL https://raw.githubusercontent.com/MichaelAkridge-NOAA/optics-si-cloud-tools/main/scripts/setup_gcloud_adc.sh | bash
For advanced scope needs, run gcloud auth application-default login with explicit scopes.
Use this as a practical command reference for daily workstation usage.
If you're new to Linux, these are the most useful day-1 commands.
# where am I?
pwd
# list files
ls
ls -la
# change directory
cd ~/ # go to home
cd .. # go up one folder
# create/remove folders
mkdir my_folder
rm -rf old_folder
# print file contents
cat README.md
# view first/last lines
head -20 file.txt
tail -20 file.txt
# search inside files
grep -i "label" file.txt
# download file
wget https://example.com/file.sh
# or with curl
curl -L -o file.sh https://example.com/file.sh
# show running processes
ps aux | head
# kill process by name
pkill -f jupyter
# use tmux for persistent terminal sessions
tmux new -s work
tmux attach -t work
# zip a folder
zip -r archive.zip my_folder/
# unzip
unzip archive.zip
# make script executable
chmod +x script.sh
# run script
./script.sh
# show command history
history
# show last 50 commands
history 50
# search command history
history | grep docker
# rerun previous command
!!
# rerun command by history number
!123
Optional: append history across sessions (add to ~/.bashrc):
shopt -s histappend
PROMPT_COMMAND='history -a'
# copy/move files
cp source.txt dest.txt
mv old_name.txt new_name.txt
# copy folders recursively
cp -r data/ backup_data/
# check sizes and free space
du -sh ~/data
df -h
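When space is tight, a quick way to find what to clean up is to rank the top-level folders under home by size:

```shell
# list the largest top-level folders under home, biggest last
# (2>/dev/null hides permission-denied noise)
du -h --max-depth=1 "$HOME" 2>/dev/null | sort -h | tail -5
```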
# create tar.gz archive
tar -czf dataset_backup.tar.gz ~/data
# extract archive
tar -xzf dataset_backup.tar.gz
Helper scripts from this repo:
# quick health check
workstation_health.sh
# preview cleanup without deleting anything
workstation_cleanup.sh --dry-run
# list existing backups in a bucket
workstation_backup.sh list gs://YOUR-BUCKET/PATH