fix(docs): Update getting-started script pre-26.3.0 #667

Merged
NickLarsenNZ merged 2 commits into main from fix/getting-started-pre-26.3.0
Mar 12, 2026

Conversation

@NickLarsenNZ
Member

Check and Update Getting Started Script

Part of stackabletech/issues#826

Note

During a Stackable release we need to check (and optionally update) the
getting-started scripts to ensure they still work after product and operator
updates.

# Some of the scripts live in a code/ subdirectory, e.g.:
#   docs/modules/superset/examples/getting_started
#   docs/modules/superset/examples/getting_started/code
pushd "$(fd -td getting_started | grep examples)"; cd code 2>/dev/null || true

# Make a fresh cluster (~12 seconds)
kind delete cluster && kind create cluster
./getting_started.sh stackablectl

# Make a fresh cluster (~12 seconds)
kind delete cluster && kind create cluster
./getting_started.sh helm

popd
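For machines without fd installed, the same directory lookup can be sketched with plain find (a hypothetical equivalent of the pushd line above, not part of the release procedure):

```shell
#!/usr/bin/env bash
# Hypothetical fd-free equivalent of the pushd lookup above: locate the
# getting_started example directory and prefer its code/ subdirectory.
dir="$(find docs/modules -type d -path '*examples*' -name getting_started 2>/dev/null | head -n1)"
if [ -d "$dir/code" ]; then
  dir="$dir/code"
fi
echo "Found: $dir"
```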

@NickLarsenNZ
Member Author

The script fails: no pod with the label spark-role: driver is found.

See completed pod logs:

logs
+ containerdebug --output=/stackable/log/containerdebug-state.json --loop
+ /stackable/spark/bin/spark-submit --verbose --master k8s://https://10.96.0.1:443 --deploy-mode cluster --name pyspark-pi --conf spark.kubernetes.driver.podTemplateFile=/stackable/spark/driver-pod-templates/template.yaml --conf spark.kubernetes.executor.podTemplateFile=/stackable/spark/executor-pod-templates/template.yaml --conf spark.kubernetes.driver.podTemplateContainerName=spark --conf spark.kubernetes.executor.podTemplateContainerName=spark --conf spark.kubernetes.namespace=default --conf spark.kubernetes.driver.container.image=oci.stackable.tech/sdp/spark-k8s:3.5.8-stackable0.0.0-dev --conf spark.kubernetes.executor.container.image=oci.stackable.tech/sdp/spark-k8s:3.5.8-stackable0.0.0-dev --conf spark.driver.defaultJavaOptions=-Dlog4j.configurationFile=/stackable/log_config/log4j2.properties --conf 'spark.driver.extraClassPath=/stackable/spark/extra-jars/*' --conf spark.executor.defaultJavaOptions=-Dlog4j.configurationFile=/stackable/log_config/log4j2.properties --conf 'spark.executor.extraClassPath=/stackable/spark/extra-jars/*' --conf spark.driver.extraJavaOptions=-Djava.security.properties=/stackable/log_config/security.properties --conf spark.executor.extraJavaOptions=-Djava.security.properties=/stackable/log_config/security.properties --conf 'spark.metrics.conf.*.sink.prometheusServlet.class=org.apache.spark.metrics.sink.PrometheusServlet' --conf 'spark.metrics.conf.*.sink.prometheusServlet.path=/metrics/prometheus' --conf spark.ui.prometheus.enabled=true --conf spark.sql.streaming.metricsEnabled=true --conf spark.driver.cores=2 --conf spark.driver.memory=640m --conf spark.executor.cores=2 --conf spark.executor.instances=1 --conf spark.executor.memory=640m --conf spark.kubernetes.driver.limit.cores=2 --conf spark.kubernetes.driver.request.cores=1 --conf spark.kubernetes.executor.limit.cores=2 --conf spark.kubernetes.executor.request.cores=1 --conf spark.kubernetes.memoryOverheadFactor=0.0 local:///stackable/spark/examples/src/main/python/pi.py
2026-03-11T14:07:45.943509Z  INFO containerdebug init: containerdebug: Starting containerdebug built_info.pkg_version="0.3.0" built_info.git_version="0.3.0" built_info.target="x86_64-unknown-linux-gnu" built_info.built_time_utc="Tue, 10 Mar 2026 08:49:22 +0000" built_info.rustc_version="rustc 1.93.0 (254b59607 2026-01-19)"
2026-03-11T14:07:45.944708Z  INFO containerdebug run:SystemInformation::collect:Resources::collect: containerdebug::system_information::resources: cpus cpus.physical=20 cpus.cores.physical=14
2026-03-11T14:07:45.944723Z  INFO containerdebug run:SystemInformation::collect:Resources::collect: containerdebug::system_information::resources: open files limit open_files.limit=2147483584
2026-03-11T14:07:45.944728Z  INFO containerdebug run:SystemInformation::collect:Resources::collect: containerdebug::system_information::resources: memory memory.total=67064463360 memory.free=16917442560 memory.available=55302832128 memory.used=11761631232
2026-03-11T14:07:45.944733Z  INFO containerdebug run:SystemInformation::collect:Resources::collect: containerdebug::system_information::resources: swap swap.total=17179865088 swap.free=13728563200 swap.used=3451301888
2026-03-11T14:07:45.944818Z  INFO containerdebug run:SystemInformation::collect:Resources::collect: containerdebug::system_information::resources: cgroup memory cgroup.memory.total=1073741824 cgroup.memory.free=1069076480 cgroup.swap.free=17179865088
2026-03-11T14:07:45.944910Z  INFO containerdebug run:SystemInformation::collect:OperatingSystem::collect: containerdebug::system_information::os: operating system os.name="Red Hat Enterprise Linux" os.kernel.version="6.18.7" os.version="Linux (Red Hat Enterprise Linux 9.7)" os.host_name="pyspark-pi-tln58" os.cpu_arch="x86_64"
2026-03-11T14:07:45.946938Z  INFO containerdebug run:SystemInformation::collect:User::collect_current: containerdebug::system_information::user: current user user.name="stackable" user.uid="Uid(1000)" user.gid="Uid(1000)"
2026-03-11T14:07:45.947344Z  INFO containerdebug run:SystemInformation::collect:Disk::collect_all: containerdebug::system_information::disk: found disk disk.mount_point="/" disk.name="overlay" disk.space.total=3915662131200 disk.space.available=2306362675200
2026-03-11T14:07:45.947353Z  INFO containerdebug run:SystemInformation::collect:Disk::collect_all: containerdebug::system_information::disk: found disk disk.mount_point="/stackable/log" disk.name="/dev/mapper/lvmroot-root" disk.space.total=3915662131200 disk.space.available=2306362675200
2026-03-11T14:07:45.947359Z  INFO containerdebug run:SystemInformation::collect:Disk::collect_all: containerdebug::system_information::disk: found disk disk.mount_point="/etc/hosts" disk.name="/dev/mapper/lvmroot-root" disk.space.total=3915662131200 disk.space.available=2306362675200
2026-03-11T14:07:45.947365Z  INFO containerdebug run:SystemInformation::collect:Disk::collect_all: containerdebug::system_information::disk: found disk disk.mount_point="/dev/termination-log" disk.name="/dev/mapper/lvmroot-root" disk.space.total=3915662131200 disk.space.available=2306362675200
2026-03-11T14:07:45.947370Z  INFO containerdebug run:SystemInformation::collect:Disk::collect_all: containerdebug::system_information::disk: found disk disk.mount_point="/etc/hostname" disk.name="/dev/mapper/lvmroot-root" disk.space.total=3915662131200 disk.space.available=2306362675200
2026-03-11T14:07:45.947375Z  INFO containerdebug run:SystemInformation::collect:Disk::collect_all: containerdebug::system_information::disk: found disk disk.mount_point="/etc/resolv.conf" disk.name="/dev/mapper/lvmroot-root" disk.space.total=3915662131200 disk.space.available=2306362675200
2026-03-11T14:07:45.947381Z  INFO containerdebug run:SystemInformation::collect:Disk::collect_all: containerdebug::system_information::disk: found disk disk.mount_point="/stackable/spark/driver-pod-templates" disk.name="/dev/mapper/lvmroot-root" disk.space.total=3915662131200 disk.space.available=2306362675200
2026-03-11T14:07:45.947386Z  INFO containerdebug run:SystemInformation::collect:Disk::collect_all: containerdebug::system_information::disk: found disk disk.mount_point="/stackable/spark/executor-pod-templates" disk.name="/dev/mapper/lvmroot-root" disk.space.total=3915662131200 disk.space.available=2306362675200
2026-03-11T14:07:45.947786Z  INFO containerdebug run:SystemInformation::collect:SystemNetworkInfo::collect: containerdebug::system_information::network: found network interface network.interface.name="lo" network.interface.address=127.0.0.1
2026-03-11T14:07:45.947798Z  INFO containerdebug run:SystemInformation::collect:SystemNetworkInfo::collect: containerdebug::system_information::network: found network interface network.interface.name="eth0" network.interface.address=10.244.0.11
2026-03-11T14:07:45.947803Z  INFO containerdebug run:SystemInformation::collect:SystemNetworkInfo::collect: containerdebug::system_information::network: found network interface network.interface.name="lo" network.interface.address=::1
2026-03-11T14:07:45.947809Z  INFO containerdebug run:SystemInformation::collect:SystemNetworkInfo::collect: containerdebug::system_information::network: found network interface network.interface.name="eth0" network.interface.address=fe80::7c54:97ff:fe13:a72f
2026-03-11T14:07:45.947816Z  INFO containerdebug run:SystemInformation::collect:SystemNetworkInfo::collect: containerdebug::system_information::network: ip addresses network.addresses.ip={10.244.0.11, 127.0.0.1, ::1, fe80::7c54:97ff:fe13:a72f}
2026-03-11T14:07:45.968333Z  INFO containerdebug run:SystemInformation::collect:SystemNetworkInfo::collect: containerdebug::system_information::network: performed reverse DNS lookup for IP ip=10.244.0.11 hostnames=["pyspark-pi-tln58."]
2026-03-11T14:07:45.968367Z  INFO containerdebug run:SystemInformation::collect:SystemNetworkInfo::collect: containerdebug::system_information::network: performed reverse DNS lookup for IP ip=127.0.0.1 hostnames=["localhost."]
2026-03-11T14:07:45.968378Z  INFO containerdebug run:SystemInformation::collect:SystemNetworkInfo::collect: containerdebug::system_information::network: performed reverse DNS lookup for IP ip=::1 hostnames=["ip6-loopback.", "ip6-localhost.", "localhost."]
2026-03-11T14:07:45.968389Z  WARN containerdebug run:SystemInformation::collect:SystemNetworkInfo::collect: containerdebug::system_information::network: reverse DNS lookup failed ip=fe80::7c54:97ff:fe13:a72f error=proto error: no records found for Query { name: Name("f.2.7.a.3.1.e.f.f.f.7.9.4.5.c.7.0.0.0.0.0.0.0.0.0.0.0.0.0.8.e.f.ip6.arpa."), query_type: PTR, query_class: IN }
2026-03-11T14:07:45.968419Z  INFO containerdebug run:SystemInformation::collect:SystemNetworkInfo::collect: containerdebug::system_information::network: hostnames network.addresses.hostname={"ip6-localhost.", "ip6-loopback.", "localhost.", "pyspark-pi-tln58."}
2026-03-11T14:07:45.984735Z  INFO containerdebug run:SystemInformation::collect:SystemNetworkInfo::collect: containerdebug::system_information::network: performed forward DNS lookup for hostname hostname="pyspark-pi-tln58." ips=[10.244.0.11]
2026-03-11T14:07:45.984758Z  INFO containerdebug run:SystemInformation::collect:SystemNetworkInfo::collect: containerdebug::system_information::network: performed forward DNS lookup for hostname hostname="localhost." ips=[127.0.0.1]
2026-03-11T14:07:45.984763Z  INFO containerdebug run:SystemInformation::collect:SystemNetworkInfo::collect: containerdebug::system_information::network: performed forward DNS lookup for hostname hostname="ip6-localhost." ips=[::1]
2026-03-11T14:07:45.984768Z  INFO containerdebug run:SystemInformation::collect:SystemNetworkInfo::collect: containerdebug::system_information::network: performed forward DNS lookup for hostname hostname="ip6-loopback." ips=[::1]
2026-03-11T14:07:45.984866Z  INFO containerdebug run: containerdebug: scheduling next run... next_run=Instant { tv_sec: 278024, tv_nsec: 663283986 }
ERROR StatusLogger Reconfiguration failed: No configuration found for '5bc2b487' at 'null' in 'null'
ERROR StatusLogger Reconfiguration failed: No configuration found for 'Default' at 'null' in 'null'
Using properties file: null
Parsed arguments:
  master                  k8s://https://10.96.0.1:443
  remote                  null
  deployMode              cluster
  executorMemory          640m
  executorCores           2
  totalExecutorCores      null
  propertiesFile          null
  driverMemory            640m
  driverCores             2
  driverExtraClassPath    /stackable/spark/extra-jars/*
  driverExtraLibraryPath  null
  driverExtraJavaOptions  -Djava.security.properties=/stackable/log_config/security.properties
  supervise               false
  queue                   null
  numExecutors            1
  files                   null
  pyFiles                 null
  archives                null
  mainClass               null
  primaryResource         local:///stackable/spark/examples/src/main/python/pi.py
  name                    pyspark-pi
  childArgs               []
  jars                    null
  packages                null
  packagesExclusions      null
  repositories            null
  verbose                 true

Spark properties used, including those specified through
 --conf and those from the properties file null:
  (spark.driver.cores,2)
  (spark.driver.defaultJavaOptions,-Dlog4j.configurationFile=/stackable/log_config/log4j2.properties)
  (spark.driver.extraClassPath,/stackable/spark/extra-jars/*)
  (spark.driver.extraJavaOptions,-Djava.security.properties=/stackable/log_config/security.properties)
  (spark.driver.memory,640m)
  (spark.executor.cores,2)
  (spark.executor.defaultJavaOptions,-Dlog4j.configurationFile=/stackable/log_config/log4j2.properties)
  (spark.executor.extraClassPath,/stackable/spark/extra-jars/*)
  (spark.executor.extraJavaOptions,-Djava.security.properties=/stackable/log_config/security.properties)
  (spark.executor.instances,1)
  (spark.executor.memory,640m)
  (spark.kubernetes.driver.container.image,oci.stackable.tech/sdp/spark-k8s:3.5.8-stackable0.0.0-dev)
  (spark.kubernetes.driver.limit.cores,2)
  (spark.kubernetes.driver.podTemplateContainerName,spark)
  (spark.kubernetes.driver.podTemplateFile,/stackable/spark/driver-pod-templates/template.yaml)
  (spark.kubernetes.driver.request.cores,1)
  (spark.kubernetes.executor.container.image,oci.stackable.tech/sdp/spark-k8s:3.5.8-stackable0.0.0-dev)
  (spark.kubernetes.executor.limit.cores,2)
  (spark.kubernetes.executor.podTemplateContainerName,spark)
  (spark.kubernetes.executor.podTemplateFile,/stackable/spark/executor-pod-templates/template.yaml)
  (spark.kubernetes.executor.request.cores,1)
  (spark.kubernetes.memoryOverheadFactor,0.0)
  (spark.kubernetes.namespace,default)
  (spark.metrics.conf.*.sink.prometheusServlet.class,org.apache.spark.metrics.sink.PrometheusServlet)
  (spark.metrics.conf.*.sink.prometheusServlet.path,/metrics/prometheus)
  (spark.sql.streaming.metricsEnabled,true)
  (spark.ui.prometheus.enabled,true)

    
26/03/11 14:07:47 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Main class:
org.apache.spark.deploy.k8s.submit.KubernetesClientApplication
Arguments:
--primary-py-file
local:///stackable/spark/examples/src/main/python/pi.py
--main-class
org.apache.spark.deploy.PythonRunner
Spark config:
(spark.app.name,pyspark-pi)
(spark.app.submitTime,1773238067789)
(spark.driver.cores,2)
(spark.driver.defaultJavaOptions,-Dlog4j.configurationFile=/stackable/log_config/log4j2.properties)
(spark.driver.extraClassPath,/stackable/spark/extra-jars/*)
(spark.driver.extraJavaOptions,-Djava.security.properties=/stackable/log_config/security.properties)
(spark.driver.memory,640m)
(spark.executor.cores,2)
(spark.executor.defaultJavaOptions,-Dlog4j.configurationFile=/stackable/log_config/log4j2.properties)
(spark.executor.extraClassPath,/stackable/spark/extra-jars/*)
(spark.executor.extraJavaOptions,-Djava.security.properties=/stackable/log_config/security.properties)
(spark.executor.instances,1)
(spark.executor.memory,640m)
(spark.kubernetes.driver.container.image,oci.stackable.tech/sdp/spark-k8s:3.5.8-stackable0.0.0-dev)
(spark.kubernetes.driver.limit.cores,2)
(spark.kubernetes.driver.podTemplateContainerName,spark)
(spark.kubernetes.driver.podTemplateFile,/stackable/spark/driver-pod-templates/template.yaml)
(spark.kubernetes.driver.request.cores,1)
(spark.kubernetes.executor.container.image,oci.stackable.tech/sdp/spark-k8s:3.5.8-stackable0.0.0-dev)
(spark.kubernetes.executor.limit.cores,2)
(spark.kubernetes.executor.podTemplateContainerName,spark)
(spark.kubernetes.executor.podTemplateFile,/stackable/spark/executor-pod-templates/template.yaml)
(spark.kubernetes.executor.request.cores,1)
(spark.kubernetes.memoryOverheadFactor,0.0)
(spark.kubernetes.namespace,default)
(spark.master,k8s://https://10.96.0.1:443)
(spark.metrics.conf.*.sink.prometheusServlet.class,org.apache.spark.metrics.sink.PrometheusServlet)
(spark.metrics.conf.*.sink.prometheusServlet.path,/metrics/prometheus)
(spark.sql.streaming.metricsEnabled,true)
(spark.submit.deployMode,cluster)
(spark.submit.pyFiles,)
(spark.ui.prometheus.enabled,true)
Classpath elements:



26/03/11 14:07:47 INFO SparkKubernetesClientFactory: Auto-configuring K8S client using current context from users K8S config file
26/03/11 14:07:49 INFO KerberosConfDriverFeatureStep: You have not specified a krb5.conf file locally or via a ConfigMap. Make sure that you have the krb5.conf locally on the driver image.
26/03/11 14:07:49 INFO LoggingPodStatusWatcherImpl: State changed, new state: 
	 pod name: pyspark-pi-1215d79cdd39a2b4-driver
	 namespace: default
	 labels: app.kubernetes.io/component -> spark, app.kubernetes.io/instance -> pyspark-pi, app.kubernetes.io/managed-by -> spark.stackable.tech_sparkapplication, app.kubernetes.io/name -> spark-k8s, app.kubernetes.io/role-group -> sparkapplication, app.kubernetes.io/version -> 3.5.8-stackable0.0.0-dev, prometheus.io/scrape -> true, spark-app-name -> pyspark-pi, spark-app-selector -> spark-1f908fee91404eff86916ea993022d70, spark-role -> driver, spark-version -> 3.5.8-stackable0.0.0-dev, stackable.tech/vendor -> Stackable
	 pod uid: 2377186b-252f-4ca4-8ad4-fa25b2762379
	 creation time: 2026-03-11T14:07:49Z
	 service account name: pyspark-pi
	 volumes: log, log-config, config, pod-template-volume, spark-local-dir-1, spark-conf-volume-driver, kube-api-access-zv7dj
	 node name: kind-control-plane
	 start time: 2026-03-11T14:07:49Z
	 phase: Pending
	 container status: 
		 container name: spark
		 container image: oci.stackable.tech/sdp/spark-k8s:3.5.8-stackable0.0.0-dev
		 container state: waiting
		 pending reason: ContainerCreating
26/03/11 14:07:49 INFO LoggingPodStatusWatcherImpl: Waiting for application pyspark-pi with application ID spark-1f908fee91404eff86916ea993022d70 and submission ID default:pyspark-pi-1215d79cdd39a2b4-driver to finish...
26/03/11 14:07:49 INFO LoggingPodStatusWatcherImpl: State changed, new state: 
	 pod name: pyspark-pi-1215d79cdd39a2b4-driver
	 namespace: default
	 labels: app.kubernetes.io/component -> spark, app.kubernetes.io/instance -> pyspark-pi, app.kubernetes.io/managed-by -> spark.stackable.tech_sparkapplication, app.kubernetes.io/name -> spark-k8s, app.kubernetes.io/role-group -> sparkapplication, app.kubernetes.io/version -> 3.5.8-stackable0.0.0-dev, prometheus.io/scrape -> true, spark-app-name -> pyspark-pi, spark-app-selector -> spark-1f908fee91404eff86916ea993022d70, spark-role -> driver, spark-version -> 3.5.8-stackable0.0.0-dev, stackable.tech/vendor -> Stackable
	 pod uid: 2377186b-252f-4ca4-8ad4-fa25b2762379
	 creation time: 2026-03-11T14:07:49Z
	 service account name: pyspark-pi
	 volumes: log, log-config, config, pod-template-volume, spark-local-dir-1, spark-conf-volume-driver, kube-api-access-zv7dj
	 node name: kind-control-plane
	 start time: 2026-03-11T14:07:49Z
	 phase: Pending
	 container status: 
		 container name: spark
		 container image: oci.stackable.tech/sdp/spark-k8s:3.5.8-stackable0.0.0-dev
		 container state: waiting
		 pending reason: ContainerCreating
26/03/11 14:07:50 INFO LoggingPodStatusWatcherImpl: Application status for spark-1f908fee91404eff86916ea993022d70 (phase: Pending)
26/03/11 14:07:50 INFO LoggingPodStatusWatcherImpl: State changed, new state: 
	 pod name: pyspark-pi-1215d79cdd39a2b4-driver
	 namespace: default
	 labels: app.kubernetes.io/component -> spark, app.kubernetes.io/instance -> pyspark-pi, app.kubernetes.io/managed-by -> spark.stackable.tech_sparkapplication, app.kubernetes.io/name -> spark-k8s, app.kubernetes.io/role-group -> sparkapplication, app.kubernetes.io/version -> 3.5.8-stackable0.0.0-dev, prometheus.io/scrape -> true, spark-app-name -> pyspark-pi, spark-app-selector -> spark-1f908fee91404eff86916ea993022d70, spark-role -> driver, spark-version -> 3.5.8-stackable0.0.0-dev, stackable.tech/vendor -> Stackable
	 pod uid: 2377186b-252f-4ca4-8ad4-fa25b2762379
	 creation time: 2026-03-11T14:07:49Z
	 service account name: pyspark-pi
	 volumes: log, log-config, config, pod-template-volume, spark-local-dir-1, spark-conf-volume-driver, kube-api-access-zv7dj
	 node name: kind-control-plane
	 start time: 2026-03-11T14:07:49Z
	 phase: Running
	 container status: 
		 container name: spark
		 container image: oci.stackable.tech/sdp/spark-k8s:3.5.8-stackable0.0.0-dev
		 container state: running
		 container started at: 2026-03-11T14:07:50Z
26/03/11 14:07:51 INFO LoggingPodStatusWatcherImpl: Application status for spark-1f908fee91404eff86916ea993022d70 (phase: Running)
26/03/11 14:07:52 INFO LoggingPodStatusWatcherImpl: Application status for spark-1f908fee91404eff86916ea993022d70 (phase: Running)
26/03/11 14:07:53 INFO LoggingPodStatusWatcherImpl: Application status for spark-1f908fee91404eff86916ea993022d70 (phase: Running)
26/03/11 14:07:54 INFO LoggingPodStatusWatcherImpl: Application status for spark-1f908fee91404eff86916ea993022d70 (phase: Running)
26/03/11 14:07:55 INFO LoggingPodStatusWatcherImpl: Application status for spark-1f908fee91404eff86916ea993022d70 (phase: Running)
26/03/11 14:07:56 INFO LoggingPodStatusWatcherImpl: Application status for spark-1f908fee91404eff86916ea993022d70 (phase: Running)
26/03/11 14:07:57 INFO LoggingPodStatusWatcherImpl: Application status for spark-1f908fee91404eff86916ea993022d70 (phase: Running)
26/03/11 14:07:58 INFO LoggingPodStatusWatcherImpl: Application status for spark-1f908fee91404eff86916ea993022d70 (phase: Running)
26/03/11 14:07:59 INFO LoggingPodStatusWatcherImpl: Application status for spark-1f908fee91404eff86916ea993022d70 (phase: Running)
26/03/11 14:08:00 INFO LoggingPodStatusWatcherImpl: Application status for spark-1f908fee91404eff86916ea993022d70 (phase: Running)
26/03/11 14:08:01 INFO LoggingPodStatusWatcherImpl: Application status for spark-1f908fee91404eff86916ea993022d70 (phase: Running)
26/03/11 14:08:01 INFO LoggingPodStatusWatcherImpl: State changed, new state: 
	 pod name: pyspark-pi-1215d79cdd39a2b4-driver
	 namespace: default
	 labels: app.kubernetes.io/component -> spark, app.kubernetes.io/instance -> pyspark-pi, app.kubernetes.io/managed-by -> spark.stackable.tech_sparkapplication, app.kubernetes.io/name -> spark-k8s, app.kubernetes.io/role-group -> sparkapplication, app.kubernetes.io/version -> 3.5.8-stackable0.0.0-dev, prometheus.io/scrape -> true, spark-app-name -> pyspark-pi, spark-app-selector -> spark-1f908fee91404eff86916ea993022d70, spark-role -> driver, spark-version -> 3.5.8-stackable0.0.0-dev, stackable.tech/vendor -> Stackable
	 pod uid: 2377186b-252f-4ca4-8ad4-fa25b2762379
	 creation time: 2026-03-11T14:07:49Z
	 service account name: pyspark-pi
	 volumes: log, log-config, config, pod-template-volume, spark-local-dir-1, spark-conf-volume-driver, kube-api-access-zv7dj
	 node name: kind-control-plane
	 start time: 2026-03-11T14:07:49Z
	 phase: Running
	 container status: 
		 container name: spark
		 container image: oci.stackable.tech/sdp/spark-k8s:3.5.8-stackable0.0.0-dev
		 container state: terminated
		 container started at: 2026-03-11T14:07:50Z
		 container finished at: 2026-03-11T14:08:00Z
		 exit code: 0
		 termination reason: Completed
26/03/11 14:08:02 INFO LoggingPodStatusWatcherImpl: Application status for spark-1f908fee91404eff86916ea993022d70 (phase: Running)
26/03/11 14:08:02 INFO LoggingPodStatusWatcherImpl: State changed, new state: 
	 pod name: pyspark-pi-1215d79cdd39a2b4-driver
	 namespace: default
	 labels: app.kubernetes.io/component -> spark, app.kubernetes.io/instance -> pyspark-pi, app.kubernetes.io/managed-by -> spark.stackable.tech_sparkapplication, app.kubernetes.io/name -> spark-k8s, app.kubernetes.io/role-group -> sparkapplication, app.kubernetes.io/version -> 3.5.8-stackable0.0.0-dev, prometheus.io/scrape -> true, spark-app-name -> pyspark-pi, spark-app-selector -> spark-1f908fee91404eff86916ea993022d70, spark-role -> driver, spark-version -> 3.5.8-stackable0.0.0-dev, stackable.tech/vendor -> Stackable
	 pod uid: 2377186b-252f-4ca4-8ad4-fa25b2762379
	 creation time: 2026-03-11T14:07:49Z
	 service account name: pyspark-pi
	 volumes: log, log-config, config, pod-template-volume, spark-local-dir-1, spark-conf-volume-driver, kube-api-access-zv7dj
	 node name: kind-control-plane
	 start time: 2026-03-11T14:07:49Z
	 phase: Succeeded
	 container status: 
		 container name: spark
		 container image: oci.stackable.tech/sdp/spark-k8s:3.5.8-stackable0.0.0-dev
		 container state: terminated
		 container started at: 2026-03-11T14:07:50Z
		 container finished at: 2026-03-11T14:08:00Z
		 exit code: 0
		 termination reason: Completed
26/03/11 14:08:02 INFO LoggingPodStatusWatcherImpl: Application status for spark-1f908fee91404eff86916ea993022d70 (phase: Succeeded)
26/03/11 14:08:02 INFO LoggingPodStatusWatcherImpl: Container final statuses:


	 container name: spark
	 container image: oci.stackable.tech/sdp/spark-k8s:3.5.8-stackable0.0.0-dev
	 container state: terminated
	 container started at: 2026-03-11T14:07:50Z
	 container finished at: 2026-03-11T14:08:00Z
	 exit code: 0
	 termination reason: Completed
26/03/11 14:08:02 INFO LoggingPodStatusWatcherImpl: Application pyspark-pi with application ID spark-1f908fee91404eff86916ea993022d70 and submission ID default:pyspark-pi-1215d79cdd39a2b4-driver finished
26/03/11 14:08:02 INFO ShutdownHookManager: Shutdown hook called
26/03/11 14:08:02 INFO ShutdownHookManager: Deleting directory /tmp/spark-dbfe7830-5854-4d77-9444-3f85f9a9e698
26/03/11 14:08:02 INFO ShutdownHookManager: Deleting directory /tmp/spark-9cb35d08-4b22-491b-85b8-8fbb53ec0ed0
26/03/11 14:08:02 INFO ShutdownHookManager: Deleting directory /tmp/spark-e1487c6e-fcad-488f-a6b5-e6addccca42a
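The log above shows a driver pod that does carry the spark-role -> driver label and terminates with exit code 0, so the lookup in the script may simply be too strict once the driver has completed. A hedged sketch of a lookup that tolerates a completed driver (the kubectl flags are standard; the wrapper function and its use are hypothetical, not the script's actual code):

```shell
#!/usr/bin/env bash
# Hypothetical sketch: find the Spark driver pod by label alone, whether it
# is still running or has already completed (no readiness requirement).
# Assumes kubectl is on PATH and the kubeconfig points at the kind cluster.
driver_pod() {
  kubectl get pods -l spark-role=driver \
    -o jsonpath='{.items[0].metadata.name}' 2>/dev/null
}

pod="$(driver_pod || true)"
if [ -n "$pod" ]; then
  echo "driver pod: $pod"
else
  echo "no driver pod found" >&2
fi
```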

@NickLarsenNZ NickLarsenNZ moved this from Development: In Progress to Development: Waiting for Review in Stackable Engineering Mar 11, 2026
@NickLarsenNZ NickLarsenNZ marked this pull request as ready for review March 11, 2026 15:29
Member

@razvan razvan left a comment


lgtm

@sbernauer sbernauer moved this from Development: Waiting for Review to Development: In Review in Stackable Engineering Mar 12, 2026
@NickLarsenNZ NickLarsenNZ added this pull request to the merge queue Mar 12, 2026
@NickLarsenNZ NickLarsenNZ moved this from Development: In Review to Development: Track in Stackable Engineering Mar 12, 2026
@NickLarsenNZ NickLarsenNZ moved this from Development: Track to Development: Done in Stackable Engineering Mar 12, 2026
Merged via the queue into main with commit 41068d9 Mar 12, 2026
10 checks passed
@NickLarsenNZ NickLarsenNZ deleted the fix/getting-started-pre-26.3.0 branch March 12, 2026 09:14
Projects

Status: Development: Done

3 participants