Yuming Wang created SPARK-44651:
-----------------------------------

             Summary: Make do-release-docker.sh compatible with Mac m2
                 Key: SPARK-44651
                 URL: https://issues.apache.org/jira/browse/SPARK-44651
             Project: Spark
          Issue Type: Improvement
          Components: Project Infra
    Affects Versions: 4.0.0
            Reporter: Yuming Wang


How to reproduce (on an Apple Silicon Mac, where the release Docker image builds as linux/arm64):
{code:sh}
# -d: working directory, -s: run only the listed step, -n: dry run
dev/create-release/do-release-docker.sh -d /Users/yumwang/release-spark/output -s docs -n
{code}


Fix for the grpcio and Python.h failures below: install python3-dev and build-essential in the image:
{code:sh}
$APT_INSTALL python-is-python3 python3-pip python3-setuptools python3-dev build-essential
{code}
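A sketch of where this could land: the release image is built from dev/create-release/spark-rm/Dockerfile, which defines an $APT_INSTALL helper. The exact alias definition and placement below are assumptions about the current Dockerfile layout, not the actual patch:

{code:sh}
# Hypothetical fragment for dev/create-release/spark-rm/Dockerfile;
# $APT_INSTALL is assumed to expand to roughly
# "apt-get install --no-install-recommends -y".
RUN apt-get update && \
    $APT_INSTALL python-is-python3 python3-pip python3-setuptools \
                 python3-dev build-essential
{code}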

{noformat}
Collecting grpcio==1.56.0
  Downloading grpcio-1.56.0.tar.gz (24.3 MB)
     |████████████████████████████████| 24.3 MB 6.7 MB/s 
    ERROR: Command errored out with exit status 1:
     command: /usr/bin/python3 -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-qmfpon02/grpcio/setup.py'"'"'; __file__='"'"'/tmp/pip-install-qmfpon02/grpcio/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-install-qmfpon02/grpcio/pip-egg-info
         cwd: /tmp/pip-install-qmfpon02/grpcio/
    Complete output (11 lines):
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmp/pip-install-qmfpon02/grpcio/setup.py", line 263, in <module>
        if check_linker_need_libatomic():
      File "/tmp/pip-install-qmfpon02/grpcio/setup.py", line 210, in check_linker_need_libatomic
        cpp_test = subprocess.Popen(cxx + ['-x', 'c++', '-std=c++14', '-'],
      File "/usr/lib/python3.8/subprocess.py", line 858, in __init__
        self._execute_child(args, executable, preexec_fn, close_fds,
      File "/usr/lib/python3.8/subprocess.py", line 1704, in _execute_child
        raise child_exception_type(errno_num, err_msg, err_filename)
    FileNotFoundError: [Errno 2] No such file or directory: 'c++'
    ----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
...

  Could not find <Python.h>. This could mean the following:
    * You're on Ubuntu and haven't run `apt-get install python3-dev`.
    * You're on RHEL/Fedora and haven't run `yum install python3-devel` or
      `dnf install python3-devel` (make sure you also have redhat-rpm-config
      installed)
    * You're on Mac OS X and the usual Python framework was somehow corrupted
      (check your environment variables or try re-installing?)
    * You're on Windows and your Python installation was somehow corrupted
      (check your environment variables or try re-installing?)
{noformat}



Building pyarrow also fails:
{noformat}
#5 848.0 Successfully built grpcio future
#5 848.0 Failed to build pyarrow
#5 848.7 ERROR: Could not build wheels for pyarrow which use PEP 517 and cannot be installed directly
{noformat}
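The pyarrow failure is the same class of arm64 problem: when no wheel matches the image's platform, pip falls back to a from-source build, which needs the Arrow C++ libraries and fails here. A sketch of one workaround, assuming the pinned pyarrow version simply predates aarch64 wheels on PyPI; the version bound is an assumption, not the script's actual pin:

{code:sh}
# Sketch: upgrade pip so it can select newer manylinux aarch64 wheels,
# then install a pyarrow version that ships them. The ">=4.0.0" bound is
# an assumption; adjust to whatever the release scripts actually require.
pip3 install --upgrade pip setuptools wheel
pip3 install 'pyarrow>=4.0.0'
{code}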


Installing R also fails with unmet dependencies:
{noformat}
root@c57ec74c8d32:/# $APT_INSTALL r-base r-base-dev
Reading package lists... Done
Building dependency tree
Reading state information... Done
Some packages could not be installed. This may mean that you have
requested an impossible situation or if you are using the unstable
distribution that some required packages have not yet been created
or been moved out of Incoming.
The following information may help to resolve the situation:

The following packages have unmet dependencies:
 r-base : Depends: r-base-core (>= 4.3.1-3.2004.0) but it is not going to be installed
          Depends: r-recommended (= 4.3.1-3.2004.0) but it is not going to be installed
 r-base-dev : Depends: r-base-core (>= 4.3.1-3.2004.0) but it is not going to be installed
E: Unable to correct problems, you have held broken packages.
{noformat}
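The r-base failure looks like another arm64 artifact: if the image pins a CRAN apt repository (e.g. focal-cran40), that repository publishes amd64 binaries only, so apt cannot satisfy r-base-core >= 4.3.1 on an arm64 build. A sketch of one possible workaround, falling back to Ubuntu's own R packages; the sources.list file names below are assumptions about how the CRAN source was added:

{code:sh}
# Sketch: drop the (amd64-only) CRAN apt source and install the distro's R.
# The cran*.list name and the sed pattern are assumptions; adjust to match
# how the Dockerfile actually registers the repository.
rm -f /etc/apt/sources.list.d/cran*.list
sed -i '/cloud\.r-project\.org/d' /etc/apt/sources.list
apt-get update
$APT_INSTALL r-base r-base-dev
{code}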






--
This message was sent by Atlassian Jira
(v8.20.10#820010)
