FIX ERROR – Python2: PIP fails after upgrade

After upgrading PIP on CentOS 6, which still uses Python 2.7, every PIP command crashes with an error.

For example:

pip install --upgrade pip
Collecting pip
  Downloading https://files.pythonhosted.org/packages/88/d9/761f0b1e0551a3559afe4d34bd9bf68fc8de3292363b3775dda39b62ce84/pip-22.0.3.tar.gz (2.1MB)
    100% |████████████████████████████████| 2.1MB 544kB/s
    Complete output from command python setup.py egg_info:
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmp/pip-build-snCNSf/pip/setup.py", line 7
        def read(rel_path: str) -> str:
                         ^
    SyntaxError: invalid syntax

    ----------------------------------------
Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-snCNSf/pip/
You are using pip version 8.1.2, however version 22.0.3 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.

 

Reason:

PIP versions newer than 20.3 (21.0 and later) do not support Python 2.7.

Solution:

Install the latest version that still supports Python 2.7:

pip install --upgrade pip==20.3
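
To confirm that a Python 2.7-compatible PIP is now in place, you can check the versions (the exact output depends on the environment):

python --version
pip --version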

 

 Linux – Get path and filename from full path

 

To extract the directory path or the filename from a full path, you can use various utilities such as grep, sed, or awk (or find, if the paths are not already in a file or a variable), but there is an easier way:

  • basename – returns the filename
  • dirname – returns the path to the file

Example:

basename /home/artem/file.txt
file.txt

dirname /home/artem/file.txt
/home/artem

 

Both utilities are part of the coreutils package.
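
If the paths come as a list, for example from find output, the same utilities work in a loop. A small sketch (the directory and the file mask are made up for illustration):

find /home/artem -type f -name '*.txt' | while read -r path; do
  echo "dir:  $(dirname "$path")"
  echo "file: $(basename "$path")"
done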

 S3 – Mounting in Linux

To mount an S3 bucket as a file system, you need to install s3fs.

Create a directory to mount:

mkdir -p /mnt/s3

 

And add the following to "/etc/fstab":

artem-service-bucket:/upload/ /mnt/s3 fuse.s3fs _netdev,rw,nosuid,nodev,allow_other,nonempty,iam_role,umask=022,url=https://s3.eu-central-1.amazonaws.com,endpoint=eu-central-1 0 0

 

Where:

  • "artem-service-bucket:/upload/" – S3 bucket name and the directory inside the bucket to mount
  • "url=https://s3.eu-central-1.amazonaws.com,endpoint=eu-central-1" – the region where the S3 bucket is located
  • "iam_role" – indicate that we will use the IAM Role for authentication

 

Mount:

mount -a
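
For a quick test before relying on "/etc/fstab", the same mount can be done directly with s3fs (a sketch mirroring the entry above; "iam_role=auto" is assumed to behave like the bare "iam_role" option, and the bucket, path, and region must match your setup):

s3fs artem-service-bucket:/upload/ /mnt/s3 \
  -o iam_role=auto,allow_other,nonempty,umask=022 \
  -o url=https://s3.eu-central-1.amazonaws.com,endpoint=eu-central-1

# Check that the bucket is mounted
mount | grep s3fs
ls /mnt/s3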

 

 

 Terraform – AWS Secrets Manager: Retrieve RDS login/password

The task is to retrieve the RDS login and password stored in AWS Secrets Manager and use their values in the Terraform code. To do this, you can use the following construction:

# The secret must already exist before the apply
data "aws_secretsmanager_secret" "rds-admin-user" {
  name  = "/ARTEM-SERVICES/PROD/RDS/CREDENTIALS"
}

data "aws_secretsmanager_secret_version" "rds-admin-user" {
  secret_id = data.aws_secretsmanager_secret.rds-admin-user.id
}

locals {
  additional_rds_username      = jsondecode(data.aws_secretsmanager_secret_version.rds-admin-user.secret_string)["username"]
  additional_rds_user_password = jsondecode(data.aws_secretsmanager_secret_version.rds-admin-user.secret_string)["password"]
}

 

And reference the values:

local.additional_rds_username
local.additional_rds_user_password
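
For example, the retrieved credentials can be passed straight into an RDS resource (a minimal illustrative sketch; the resource name and settings below are made up):

resource "aws_db_instance" "additional" {
  identifier        = "artem-services-prod-rds-additional"
  engine            = "mysql"
  instance_class    = "db.t3.micro"
  allocated_storage = 20

  username = local.additional_rds_username
  password = local.additional_rds_user_password

  skip_final_snapshot = true
}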

 

 

 Terraform – AWS SSM: Extract content

The SSM Parameter Store contains the following JSON:

{
  "username": "admin",
  "password": "password"
}

 

The task is to extract the login and password and use their values in the Terraform code. To do this, you can use the following construction:

# The parameter must already exist before the apply
data "aws_ssm_parameter" "rds-admin-user" {
  name  = "/ARTEM-SERVICES/PROD/RDS/CREDENTIALS"
}

locals {
  additional_rds_username      = jsondecode(data.aws_ssm_parameter.rds-admin-user.value)["username"]
  additional_rds_user_password = jsondecode(data.aws_ssm_parameter.rds-admin-user.value)["password"]
}

 

And reference the values:

local.additional_rds_username
local.additional_rds_user_password
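
The parameter itself can also be managed from Terraform, with jsonencode() producing the same JSON structure (an illustrative sketch; in the setup above the parameter is simply expected to exist before the apply):

resource "aws_ssm_parameter" "rds-admin-user" {
  name  = "/ARTEM-SERVICES/PROD/RDS/CREDENTIALS"
  type  = "SecureString"
  value = jsonencode({
    username = "admin"
    password = "password"
  })
}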

 

 

 OpenVPN – Exclude specific IPs or networks from routes

To exclude a specific network or IP address from the pushed routes, add the "net_gateway" flag to the route.

For example, if the network "10.0.0.0/8" must be routed through the VPN while the network "10.0.1.0/24" is excluded from that route, the entries in the server configuration file will look like this:

push "route 10.0.0.0 255.0.0.0"
push "route 10.0.1.0 255.255.255.0 net_gateway"

 

 

 FIX ERROR – Ubuntu: /etc/resolv.conf is not a symbolic link to /run/resolvconf/resolv.conf

An error similar to the following:

/etc/resolvconf/update.d/libc: Warning: /etc/resolv.conf is not a symbolic link to /run/resolvconf/resolv.conf

 

It may indicate that the "/etc/resolv.conf" symbolic link is missing. To fix this, you can create it manually:

sudo ln -s /run/resolvconf/resolv.conf /etc/resolv.conf
sudo resolvconf -u

 

Alternatively, run a "resolvconf" reconfiguration using the following command:

sudo dpkg-reconfigure resolvconf
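
Either way, it is worth confirming that the symlink is now in place:

ls -l /etc/resolv.conf
# Expected: /etc/resolv.conf -> /run/resolvconf/resolv.conf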

 Jenkins – Python VirtualEnv with version selection

 

To select a Python version in the pipeline, you need to have the required versions installed on the system.

The steps below were performed on CentOS 7, and the binaries were installed into the "/usr/bin/" directory for convenience, since versions "2.7" and "3.6" from the repository are already installed along this path.

Install dependencies:

yum install gcc openssl-devel bzip2-devel libffi-devel wget

 

Download the sources of the required versions, in this case "3.7", "3.8", and "3.9":

cd /usr/src

wget https://www.python.org/ftp/python/3.7.9/Python-3.7.9.tgz
wget https://www.python.org/ftp/python/3.8.9/Python-3.8.9.tgz
wget https://www.python.org/ftp/python/3.9.5/Python-3.9.5.tgz

 

Extract:

tar xzf Python-3.7.9.tgz
tar xzf Python-3.8.9.tgz
tar xzf Python-3.9.5.tgz

 

Install:

cd Python-3.7.9
./configure --enable-optimizations --prefix=/usr
make altinstall

cd ../Python-3.8.9
./configure --enable-optimizations --prefix=/usr
make altinstall

cd ../Python-3.9.5
./configure --enable-optimizations --prefix=/usr
make altinstall
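
make altinstall installs versioned binaries ("python3.7", "python3.8", "python3.9") without overwriting the system "python", so each interpreter can be checked afterwards:

ls /usr/bin/python*

python3.7 --version
python3.8 --version
python3.9 --version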

 

Now let’s install the Pyenv Pipeline plugin: go to the Jenkins settings, open the "Manage Plugins" section, switch to the "Available" tab, search for "Pyenv Pipeline", and install it.
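
If you prefer the command line and have the Jenkins CLI available, the same plugin can be installed non-interactively (a sketch; the plugin ID is assumed to be "pyenv-pipeline", and the URL and credentials are placeholders):

java -jar jenkins-cli.jar -s http://localhost:8080/ -auth admin:API_TOKEN install-plugin pyenv-pipeline -restart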

 

To select the version, we will use a "choice" parameter.

Pipeline:

properties([
  parameters([
    choice(
      name: 'PYTHON',
      description: 'Choose Python version',
      choices: ["python2.7", "python3.6", "python3.7", "python3.8", "python3.9"].join("\n")
    ),
    base64File(
      name: 'REQUIREMENTS_FILE',
      description: 'Upload requirements file (Optional)'
    )
  ])
])

pipeline {
  agent any
  options {
    buildDiscarder(logRotator(numToKeepStr: '5'))
    timeout(time: 60, unit:'MINUTES')
    timestamps()
  }
  stages {
    stage("Python"){
      steps{
        withPythonEnv("/usr/bin/${params.PYTHON}") {
          script {
            if ( !env.REQUIREMENTS_FILE ) { // Groovy truth: handles both an unset (null) and an empty parameter
              sh "python --version"
              sh "pip --version"
              sh "echo Requirements file not set. Run Python without requirements file."
            }
            else {
              sh "python --version"
              sh "pip --version"
              sh "echo Requirements file found. Run PIP install using requirements file."
              withFileParameter('REQUIREMENTS_FILE') {
                sh 'cat $REQUIREMENTS_FILE > requirements.txt'
              }
              sh "pip install -r requirements.txt"
            }
          }
        }
      }
    }
  }
}

 

Let’s start the build: select the desired version, for example "3.9", run it, and check the build log to make sure the expected Python and PIP versions are used.