Why does PySpark give "we couldn't find any external IP address" on macOS?


Suggestion : 1

You can explicitly set the local hostname to localhost (which should be resolvable) for spark-shell using the SPARK_LOCAL_HOSTNAME environment variable:

SPARK_LOCAL_HOSTNAME=localhost ./bin/spark-shell

Or:

./bin/spark-shell -c spark.driver.host=localhost

Alternatively, set the local IP in $SPARK_HOME/conf/spark-env.sh:

SPARK_LOCAL_IP=127.0.0.1
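The warning appears when the machine's hostname does not resolve to a usable (non-loopback) address. A minimal diagnostic sketch in plain Python (stdlib only, no Spark needed) shows how to check what your hostname resolves to, and therefore whether the workarounds above are needed:

```python
# Diagnostic sketch: check whether a hostname resolves, and whether it
# resolves only to loopback addresses. Pure stdlib; no Spark required.
import ipaddress
import socket

def resolves_to_loopback(hostname: str) -> bool:
    """True if hostname resolves and every resolved address is a loopback one."""
    try:
        infos = socket.getaddrinfo(hostname, None)
    except socket.gaierror:
        return False  # no resolution at all: Spark will warn and fall back
    return all(ipaddress.ip_address(info[4][0]).is_loopback for info in infos)

print(resolves_to_loopback("localhost"))           # True on a typical machine
print(resolves_to_loopback(socket.gethostname()))  # often False on macOS
```

If your hostname fails to resolve (or resolves only to loopback), that is exactly the situation the SPARK_LOCAL_HOSTNAME / SPARK_LOCAL_IP settings above are meant to paper over.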

Suggestion : 2

I use pyspark and got the warning below. Could someone tell me how to fix it? Is this something I should be worried about?


lines = sc.textFile("README.md")  # worked
lines.count()  # error

SPARK_LOCAL_HOSTNAME=localhost ./bin/spark-shell
./bin/spark-shell -c spark.driver.host=localhost
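For purely local runs the warning is usually nothing to worry about: Spark just falls back to a loopback address. The fallback order behind it can be sketched in Python (this is an approximation of the address selection done in Spark's Scala `Utils` class, not the actual code):

```python
# Rough sketch of Spark's local-address fallback order (an approximation,
# not the real Scala implementation).
import os
import socket

def pick_spark_local_ip() -> str:
    # 1. an explicit SPARK_LOCAL_IP override always wins
    override = os.environ.get("SPARK_LOCAL_IP")
    if override:
        return override
    # 2. otherwise try to resolve the machine's hostname
    try:
        return socket.gethostbyname(socket.gethostname())
    except socket.gaierror:
        # 3. "couldn't find any external IP address": fall back to loopback
        return "127.0.0.1"

os.environ["SPARK_LOCAL_IP"] = "127.0.0.1"  # the spark-env.sh fix, emulated
print(pick_spark_local_ip())                # 127.0.0.1
```

This is why setting SPARK_LOCAL_IP (or SPARK_LOCAL_HOSTNAME) silences the warning: step 1 short-circuits the hostname lookup that was failing.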

Suggestion : 3

Aug 2, 2022

c:\> spark-submit --jars="C:\Spark\hadoop\share\hadoop\common\lib\hadoop-aws-2.7.1.jar,C:\Spark\hadoop\share\hadoop\common\lib\aws-java-sdk-1.7.4.jar" --verbose --deploy-mode cluster --master spark://127.0.0.1:7077 --class FileInputRename c:\sparkSubmit\sparkSubmit_NoJarSetInConf.jar "s3://bucket/jar/fileInputRename.txt"
Using properties file: C:\Spark\bin\..\conf\spark-defaults.conf
Parsed arguments:
  master                  spark://127.0.0.1:7077
  deployMode              cluster
  executorMemory          null
  executorCores           null
  totalExecutorCores      null
  propertiesFile          C:\Spark\bin\..\conf\spark-defaults.conf
  driverMemory            null
  driverCores             null
  driverExtraClassPath    null
  driverExtraLibraryPath  null
  driverExtraJavaOptions  null
  supervise               false
  queue                   null
  numExecutors            null
  files                   null
  pyFiles                 null
  archives                null
  mainClass               FileInputRename
  primaryResource         file:/c:/sparkSubmit/sparkSubmit_NoJarSetInConf.jar
  name                    FileInputRename
  childArgs               [s3://SessionCam-Steve/jar/fileInputRename.txt]
  jars                    file:/C:/Spark/hadoop/share/hadoop/common/lib/hadoop-aws-2.7.1.jar,file:/C:/Spark/hadoop/share/hadoop/common/lib/aws-java-sdk-1.7.4.jar
  packages                null
  packagesExclusions      null
  repositories            null
  verbose                 true

Spark properties used, including those specified through
 --conf and those from the properties file C:\Spark\bin\..\conf\spark-defaults.conf:

Running Spark using the REST application submission protocol.
Main class:
org.apache.spark.deploy.rest.RestSubmissionClient
Arguments:
file:/c:/sparkSubmit/sparkSubmit_NoJarSetInConf.jar
FileInputRename
s3://SessionCam-Steve/jar/fileInputRename.txt
System properties:
SPARK_SUBMIT -> true
spark.driver.supervise -> false
spark.app.name -> FileInputRename
spark.jars -> file:/C:/Spark/hadoop/share/hadoop/common/lib/hadoop-aws-2.7.1.jar,file:/C:/Spark/hadoop/share/hadoop/common/lib/aws-java-sdk-1.7.4.jar,file:/c:/sparkSubmit/sparkSubmit_NoJarSetInConf.jar
spark.submit.deployMode -> cluster
spark.master -> spark://127.0.0.1:7077
Classpath elements:

16/03/24 12:01:56 INFO rest.RestSubmissionClient: Submitting a request to launch an application in spark://127.0.0.1:7077.
Application ID           Name             Cores  Memory per Node  Submitted Time       User           State     Duration
app-20160324120221-0016  FileInputRename  1      1024.0 MB        2016/03/24 12:02:21  Administrator  FINISHED  3 s

16/03/24 12:02:24 INFO spark.SecurityManager: Changing view acls to: Administrator
16/03/24 12:02:24 INFO spark.SecurityManager: Changing modify acls to: Administrator
16/03/24 12:02:24 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(Administrator); users with modify permissions: Set(Administrator)

Application ID           Name             Cores  Memory per Node  Submitted Time       User           State     Duration
app-20160324120600-0018  FileInputRename  2      1024.0 MB        2016/03/24 12:06:00  Administrator  FINISHED  9 s
app-20160324120543-0017  FileInputRename  2      1024.0 MB        2016/03/24 12:05:43  Administrator  FINISHED  8 s
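If a spark-submit run like the one above also hits the hostname warning, the same localhost pinning from the earlier suggestions applies to spark-submit via --conf. A hedged sketch for Windows cmd (the jar paths, master URL, class, and jar name are taken from the log above; adapt them to your setup, and note this only runs against a real Spark master):

```shell
spark-submit ^
  --conf spark.driver.host=localhost ^
  --jars "C:\Spark\hadoop\share\hadoop\common\lib\hadoop-aws-2.7.1.jar,C:\Spark\hadoop\share\hadoop\common\lib\aws-java-sdk-1.7.4.jar" ^
  --deploy-mode cluster --master spark://127.0.0.1:7077 ^
  --class FileInputRename c:\sparkSubmit\sparkSubmit_NoJarSetInConf.jar "s3://bucket/jar/fileInputRename.txt"
```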