Transition from numpy.distutils to scikit-build #3

Merged
43 commits merged on Sep 24, 2024
Changes from 41 commits

Commits (43)
1be1659
Transition from numpy.distutils to scikit-build
AndresGuzman-Ballen Jul 28, 2023
ad03590
Add scikit-build support for Windows platform
AndresGuzman-Ballen Aug 16, 2023
c62b5d7
Remove stray leftover file
oleksandr-pavlyk Sep 2, 2023
0319b43
Add conda-recipe, a GH action workflow
oleksandr-pavlyk Sep 5, 2023
5c97883
Merge remote-tracking branch 'origin/main' into transition-to-scikit-…
oleksandr-pavlyk Sep 5, 2023
0350a29
Updated instructions to build from source using ICX
oleksandr-pavlyk Sep 5, 2023
c2bbcd3
Try using /FORCE:UNRESOLVED for MSVC linker
oleksandr-pavlyk Sep 5, 2023
ef16268
Use multiple linker options
oleksandr-pavlyk Sep 5, 2023
fd833e5
Updated copyright year to 2023
oleksandr-pavlyk Sep 5, 2023
589046d
Link mkl_umath_loops to Python lib
oleksandr-pavlyk Sep 5, 2023
e80e710
No need to export all symbols
oleksandr-pavlyk Sep 5, 2023
effe815
Removed stray hard path
oleksandr-pavlyk Sep 5, 2023
bb87659
Ensure symbols are properly annotated for export
oleksandr-pavlyk Sep 5, 2023
e3db203
Specify ARCHIVE/RUNTIME/LIBRARY destinations for mkl_umath on Windows
oleksandr-pavlyk Sep 5, 2023
fe29ae6
Use vendored copy of conv_template script
oleksandr-pavlyk Sep 11, 2023
2c87ff7
Removed duplicate import line
oleksandr-pavlyk Jan 8, 2024
de82deb
Removed hard-coded paths, updated to CMake 3.27
oleksandr-pavlyk Jan 8, 2024
ea90d0f
Changes to permit vectorization of most loops by ICX
oleksandr-pavlyk Jan 10, 2024
9d1e3a3
add c99 standard
ekomarova Jun 11, 2024
e26ba4d
add high precision flags
ekomarova Jun 12, 2024
87005c1
Merge pull request #5 from ekomarova/transition-to-scikit-build
ekomarova Jun 12, 2024
2d24480
replace test_basic with pytest
ekomarova Jun 17, 2024
41eec52
Merge pull request #6 from ekomarova/transition-to-scikit-build
ekomarova Jun 17, 2024
372bf68
convert the generated integer to numpy.int64 using type
ekomarova Jun 18, 2024
cad9701
Merge pull request #7 from ekomarova/transition-to-scikit-build
ekomarova Jun 18, 2024
2cc4dd6
Changes to enable compilation with NumPy 2
oleksandr-pavlyk Sep 11, 2024
3377d38
Provide w/a for ICC and recent libmmd library
oleksandr-pavlyk Sep 12, 2024
ea9ef22
Find NumPy as Python component
oleksandr-pavlyk Sep 17, 2024
8fcf5e4
_patch is to use language_level=3
oleksandr-pavlyk Sep 17, 2024
826a330
Replace use of -c intel channel, replace use of -c main
oleksandr-pavlyk Sep 18, 2024
aa3876f
Fix issue with Windows build/test steps
oleksandr-pavlyk Sep 18, 2024
3025b1b
Use Windows-2019 container over windows-latest
oleksandr-pavlyk Sep 18, 2024
a98eae4
Add CODEOWNERS file
oleksandr-pavlyk Sep 23, 2024
2b584cf
Add dependabot file
oleksandr-pavlyk Sep 23, 2024
8ff6665
Add OpenSSF scorecard workflow
oleksandr-pavlyk Sep 23, 2024
a2d6182
Fixed upload of Windows build artifact
oleksandr-pavlyk Sep 23, 2024
5270b7c
Add a step to output content of workdir
oleksandr-pavlyk Sep 23, 2024
ff3bc26
Add SECURITY.md file
oleksandr-pavlyk Sep 23, 2024
535e68a
Attempt to fix test step for Windows
oleksandr-pavlyk Sep 24, 2024
2f0e053
Merge pull request #10 from IntelPython/add-openssf-scorecard
oleksandr-pavlyk Sep 24, 2024
030e162
Replace use of "-c intel" in the README.
oleksandr-pavlyk Sep 24, 2024
6c8c1db
Bump up versions of actions per GH warnings
oleksandr-pavlyk Sep 24, 2024
6e73a4c
Fix build.sh per review comment
oleksandr-pavlyk Sep 24, 2024
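The commit series above replaces the numpy.distutils-based build with a scikit-build/CMake build. For orientation only, a scikit-build project typically routes packaging through skbuild.setup() so that CMake compiles the C extensions. The sketch below is a hypothetical illustration of that pattern, not the setup.py from this PR; the version, description, cmake_args value, and icx compiler flag are placeholder assumptions.

# Hypothetical scikit-build entry point; all values below are illustrative placeholders.
from skbuild import setup  # drop-in replacement for setuptools.setup; delegates builds to CMake

setup(
    name="mkl_umath",
    version="0.0.0",                                      # placeholder
    description="NumPy ufunc loops backed by Intel(R) MKL (placeholder text)",
    packages=["mkl_umath"],
    cmake_args=["-DCMAKE_C_COMPILER=icx"],                # assumed flag; the PR's CMake options may differ
)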
1 change: 1 addition & 0 deletions .github/CODEOWNERS
@@ -0,0 +1 @@
* @oleksandr-pavlyk @xaleryb @ekomarova
6 changes: 6 additions & 0 deletions .github/dependabot.yml
@@ -0,0 +1,6 @@
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
290 changes: 290 additions & 0 deletions .github/workflows/conda-package.yml
@@ -0,0 +1,290 @@
name: Conda package

on: push

env:
  PACKAGE_NAME: mkl_umath
  MODULE_NAME: mkl_umath

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python: ['3.10', '3.11', '3.12']
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0

      - name: Set pkgs_dirs
        run: |
          echo "pkgs_dirs: [~/.conda/pkgs]" >> ~/.condarc

      - name: Cache conda packages
        uses: actions/cache@v3
        env:
          CACHE_NUMBER: 0  # Increase to reset cache
        with:
          path: ~/.conda/pkgs
          key:
            ${{ runner.os }}-conda-${{ env.CACHE_NUMBER }}-python-${{ matrix.python }}-${{ hashFiles('**/meta.yaml') }}
          restore-keys: |
            ${{ runner.os }}-conda-${{ env.CACHE_NUMBER }}-python-${{ matrix.python }}-
            ${{ runner.os }}-conda-${{ env.CACHE_NUMBER }}-

      - name: Add conda to system path
        run: echo $CONDA/bin >> $GITHUB_PATH

      - name: Install conda-build
        run: conda install conda-build

      - name: Build conda package
        run: |
          CHANNELS="-c conda-forge -c https://software.repos.intel.com/python/conda --override-channels"
          VERSIONS="--python ${{ matrix.python }}"
          TEST="--no-test"
          echo "CONDA_BLD=${CONDA}/conda-bld/linux-64" >> $GITHUB_ENV

          conda build \
            $TEST \
            $VERSIONS \
            $CHANNELS \
            conda-recipe-cf

      - name: Upload artifact
        uses: actions/upload-artifact@v3
        with:
          name: ${{ env.PACKAGE_NAME }} ${{ runner.os }} Python ${{ matrix.python }}
          path: ${{ env.CONDA_BLD }}/${{ env.PACKAGE_NAME }}-*.tar.bz2

  test:
    needs: build
    runs-on: ${{ matrix.runner }}

    strategy:
      matrix:
        python: ['3.10', '3.11', '3.12']
        experimental: [false]
        runner: [ubuntu-latest]
    continue-on-error: ${{ matrix.experimental }}
    env:
      CHANNELS: -c conda-forge -c https://software.repos.intel.com/python/conda --override-channels

    steps:
      - name: Download artifact
        uses: actions/download-artifact@v3
        with:
          name: ${{ env.PACKAGE_NAME }} ${{ runner.os }} Python ${{ matrix.python }}
      - name: Add conda to system path
        run: echo $CONDA/bin >> $GITHUB_PATH
      - name: Install conda-build
        run: conda install conda-build
      - name: Create conda channel
        run: |
          mkdir -p $GITHUB_WORKSPACE/channel/linux-64
          mv ${PACKAGE_NAME}-*.tar.bz2 $GITHUB_WORKSPACE/channel/linux-64
          conda index $GITHUB_WORKSPACE/channel
          # Test channel
          conda search $PACKAGE_NAME -c $GITHUB_WORKSPACE/channel --override-channels

      - name: Collect dependencies
        run: |
          CHANNELS="-c $GITHUB_WORKSPACE/channel ${{ env.CHANNELS }}"
          conda create -n test_mkl_umath $PACKAGE_NAME python=${{ matrix.python }} $CHANNELS --only-deps --dry-run > lockfile
      - name: Display lockfile
        run: cat lockfile

      - name: Set pkgs_dirs
        run: |
          echo "pkgs_dirs: [~/.conda/pkgs]" >> ~/.condarc

      - name: Cache conda packages
        uses: actions/cache@v3
        env:
          CACHE_NUMBER: 0  # Increase to reset cache
        with:
          path: ~/.conda/pkgs
          key:
            ${{ runner.os }}-conda-${{ env.CACHE_NUMBER }}-python-${{ matrix.python }}-${{ hashFiles('lockfile') }}
          restore-keys: |
            ${{ runner.os }}-conda-${{ env.CACHE_NUMBER }}-python-${{ matrix.python }}-
            ${{ runner.os }}-conda-${{ env.CACHE_NUMBER }}-

      - name: Install mkl_umath
        run: |
          CHANNELS="-c $GITHUB_WORKSPACE/channel ${{ env.CHANNELS }}"
          conda create -n test_mkl_umath python=${{ matrix.python }} $PACKAGE_NAME pytest $CHANNELS
          # Test installed packages
          conda list -n test_mkl_umath

      - name: Run tests
        run: |
          source $CONDA/etc/profile.d/conda.sh
          conda activate test_mkl_umath
          python -c "import mkl_umath, numpy as np; mkl_umath.use_in_numpy(); np.sin(np.linspace(0, 1, num=10**6));"

  build_windows:
    runs-on: windows-2019

    strategy:
      matrix:
        python: ['3.10', '3.11', '3.12']
    env:
      conda-bld: C:\Miniconda\conda-bld\win-64\
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0

      - uses: conda-incubator/setup-miniconda@v3
        with:
          miniforge-variant: Miniforge3
          miniforge-version: latest
          activate-environment: build
          channels: conda-forge
          python-version: ${{ matrix.python }}

      - name: Cache conda packages
        uses: actions/cache@v3
        env:
          CACHE_NUMBER: 3  # Increase to reset cache
        with:
          path: /home/runner/conda_pkgs_dir
          key:
            ${{ runner.os }}-conda-${{ env.CACHE_NUMBER }}-python-${{ matrix.python }}-${{ hashFiles('**/meta.yaml') }}
          restore-keys: |
            ${{ runner.os }}-conda-${{ env.CACHE_NUMBER }}-python-${{ matrix.python }}-
            ${{ runner.os }}-conda-${{ env.CACHE_NUMBER }}-

      - name: Store conda paths as envs
        shell: bash -l {0}
        run: |
          echo "CONDA_BLD=$CONDA/conda-bld/win-64/" | tr "\\\\" '/' >> $GITHUB_ENV

      - name: Install conda build
        run: |
          conda activate
          conda install -y conda-build
          conda list -n base

      - name: Build conda package
        run: |
          conda activate
          conda build --no-test --python ${{ matrix.python }} -c conda-forge -c https://software.repos.intel.com/python/conda --override-channels conda-recipe-cf

      - name: Upload artifact
        uses: actions/upload-artifact@v3
        with:
          name: ${{ env.PACKAGE_NAME }} ${{ runner.os }} Python ${{ matrix.python }}
          path: ${{ env.CONDA_BLD }}${{ env.PACKAGE_NAME }}-*.tar.bz2

  test_windows:
    needs: build_windows
    runs-on: ${{ matrix.runner }}
    defaults:
      run:
        shell: cmd /C CALL {0}
    strategy:
      matrix:
        python: ['3.10', '3.11', '3.12']
        experimental: [false]
        runner: [windows-2019]
    continue-on-error: ${{ matrix.experimental }}
    env:
      workdir: '${{ github.workspace }}'
      CHANNELS: -c conda-forge -c https://software.repos.intel.com/python/conda --override-channels

    steps:
      - name: Download artifact
        uses: actions/download-artifact@v3
        with:
          name: ${{ env.PACKAGE_NAME }} ${{ runner.os }} Python ${{ matrix.python }}

      - uses: conda-incubator/setup-miniconda@v3
        with:
          auto-update-conda: true
          conda-build-version: '*'
          miniforge-variant: Miniforge3
          miniforge-version: latest
          activate-environment: mkl_umath_test
          channels: conda-forge
          python-version: ${{ matrix.python }}

      - name: Create conda channel with the artifact bit
        shell: cmd /C CALL {0}
        run: |
          echo ${{ env.workdir }}
          mkdir ${{ env.workdir }}\channel\win-64
          move ${{ env.PACKAGE_NAME }}-*.tar.bz2 ${{ env.workdir }}\channel\win-64
          dir ${{ env.workdir }}\channel\win-64

      - name: Index the channel
        shell: cmd /C CALL {0}
        run: |
          conda index ${{ env.workdir }}\channel

      - name: Dump mkl_umath version info from created channel to STDOUT
        shell: cmd /C CALL {0}
        run: |
          conda search ${{ env.PACKAGE_NAME }} -c ${{ env.workdir }}/channel --override-channels --info --json
      - name: Dump mkl_umath version info from created channel into ver.json
        shell: cmd /C CALL {0}
        run: |
          conda search ${{ env.PACKAGE_NAME }} -c ${{ env.workdir }}/channel --override-channels --info --json > ${{ env.workdir }}\ver.json
      - name: Output content of workdir
        shell: pwsh
        run: Get-ChildItem -Path ${{ env.workdir }}
      - name: Output content of produced ver.json
        shell: pwsh
        run: Get-Content -Path ${{ env.workdir }}\ver.json
      - name: Collect dependencies
        shell: cmd /C CALL {0}
        run: |
          IF NOT EXIST ver.json (
              copy /Y ${{ env.workdir }}\ver.json .
          )
          SET "SCRIPT=%VER_SCRIPT1% %VER_SCRIPT2%"
          FOR /F "tokens=* USEBACKQ" %%F IN (`python -c "%SCRIPT%"`) DO (
              SET PACKAGE_VERSION=%%F
          )
          conda install -n mkl_umath_test ${{ env.PACKAGE_NAME }}=%PACKAGE_VERSION% python=${{ matrix.python }} -c ${{ env.workdir }}/channel ${{ env.CHANNELS }} --only-deps --dry-run > lockfile
      - name: Display lockfile content
        shell: pwsh
        run: Get-Content -Path .\lockfile
      - name: Cache conda packages
        uses: actions/cache@v3
        env:
          CACHE_NUMBER: 0  # Increase to reset cache
        with:
          path: /home/runner/conda_pkgs_dir
          key:
            ${{ runner.os }}-conda-${{ env.CACHE_NUMBER }}-python-${{ matrix.python }}-${{ hashFiles('lockfile') }}
          restore-keys: |
            ${{ runner.os }}-conda-${{ env.CACHE_NUMBER }}-python-${{ matrix.python }}-
            ${{ runner.os }}-conda-${{ env.CACHE_NUMBER }}-
      - name: Install mkl_umath
        shell: cmd /C CALL {0}
        run: |
          @ECHO ON
          IF NOT EXIST ver.json (
              copy /Y ${{ env.workdir }}\ver.json .
          )
          set "SCRIPT=%VER_SCRIPT1% %VER_SCRIPT2%"
          FOR /F "tokens=* USEBACKQ" %%F IN (`python -c "%SCRIPT%"`) DO (
              SET PACKAGE_VERSION=%%F
          )
          SET "TEST_DEPENDENCIES=pytest pytest-cov"
          conda install -n mkl_umath_test ${{ env.PACKAGE_NAME }}=%PACKAGE_VERSION% %TEST_DEPENDENCIES% python=${{ matrix.python }} -c ${{ env.workdir }}/channel ${{ env.CHANNELS }}
      - name: Report content of test environment
        shell: cmd /C CALL {0}
        run: |
          conda activate
          echo "Value of CONDA environment variable was: " %CONDA%
          echo "Value of CONDA_PREFIX environment variable was: " %CONDA_PREFIX%
          conda info && conda list -n mkl_umath_test
      - name: Run tests
        shell: cmd /C CALL {0}
        run: >-
          conda activate mkl_umath_test && python -c "import mkl_umath, numpy as np; mkl_umath.use_in_numpy(); np.sin(np.linspace(0, 1, num=10**6));"

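Both the Linux and Windows test jobs exercise the package with the same inline one-liner: import mkl_umath, call use_in_numpy(), and evaluate np.sin over a million points. For local debugging, the same smoke test can be run as a standalone script. The comparison against the unpatched result below is an added assumption for illustration; the CI step only checks that the patched call succeeds.

# Standalone version of the workflow's smoke test (see the "Run tests" steps above).
# The allclose check against NumPy's default result is extra, added here for illustration.
import numpy as np
import mkl_umath

x = np.linspace(0, 1, num=10**6)
baseline = np.sin(x)        # NumPy's stock sin loop

mkl_umath.use_in_numpy()    # route supported ufunc loops through MKL
patched = np.sin(x)

assert np.allclose(baseline, patched), "MKL-backed sin diverged from the default result"
print("mkl_umath smoke test passed")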
74 changes: 74 additions & 0 deletions .github/workflows/openssf-scorecard.yml
@@ -0,0 +1,74 @@
# This workflow uses actions that are not certified by GitHub. They are provided
# by a third-party and are governed by separate terms of service, privacy
# policy, and support documentation.

name: Scorecard supply-chain security
on:
  # For Branch-Protection check. Only the default branch is supported. See
  # https://github.com/ossf/scorecard/blob/main/docs/checks.md#branch-protection
  branch_protection_rule:
  # To guarantee Maintained check is occasionally updated. See
  # https://github.com/ossf/scorecard/blob/main/docs/checks.md#maintained
  schedule:
    - cron: '28 2 * * 1'
    - cron: '28 2 * * 4'
  push:
    branches: [ "master" ]

# Declare default permissions as read only.
permissions: read-all

jobs:
  analysis:
    name: Scorecard analysis
    runs-on: ubuntu-latest
    timeout-minutes: 30
    permissions:
      # Needed to upload the results to code-scanning dashboard.
      security-events: write
      # Needed to publish results and get a badge (see publish_results below).
      id-token: write
      # Uncomment the permissions below if installing in a private repository.
      # contents: read
      # actions: read

    steps:
      - name: "Checkout code"
        uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332 # v4.1.7
        with:
          persist-credentials: false

      - name: "Run analysis"
        uses: ossf/scorecard-action@62b2cac7ed8198b15735ed49ab1e5cf35480ba46 # v2.4.0
        with:
          results_file: results.sarif
          results_format: sarif
          # (Optional) "write" PAT token. Uncomment the `repo_token` line below if:
          # - you want to enable the Branch-Protection check on a *public* repository, or
          # - you are installing Scorecard on a *private* repository
          # To create the PAT, follow the steps in https://github.com/ossf/scorecard-action#authentication-with-pat.
          # repo_token: ${{ secrets.SCORECARD_TOKEN }}

          # Public repositories:
          # - Publish results to OpenSSF REST API for easy access by consumers
          # - Allows the repository to include the Scorecard badge.
          # - See https://github.com/ossf/scorecard-action#publishing-results.
          # For private repositories:
          # - `publish_results` will always be set to `false`, regardless
          #   of the value entered here.
          publish_results: true

      # Upload the results as artifacts (optional). Commenting out will disable uploads of run results in SARIF
      # format to the repository Actions tab.
      - name: "Upload artifact"
        uses: actions/upload-artifact@50769540e7f4bd5e21e526ee35c689e35e0d6874 # v4.4.0
        with:
          name: SARIF file
          path: results.sarif
          retention-days: 14

      # Upload the results to GitHub's code scanning dashboard.
      - name: "Upload to code-scanning"
        uses: github/codeql-action/upload-sarif@294a9d92911152fe08befb9ec03e240add280cb3 # v3.26.8
        with:
          sarif_file: results.sarif