openvino/openvino.spec
2025-05-19 00:47:09 +00:00

#
# Copyright (C) 2024 Intel Corporation
#
# Please submit issues or comments at https://github.com/openvinotoolkit/openvino/issues
%define shlib lib%{name}
%define prj_name OpenVINO
%global openeuler_release 2
Name: openvino
Version: 2024.3.0
Release: %{openeuler_release}%{?dist}
Summary: A toolkit for optimizing and deploying AI inference
License: Apache-2.0 AND BSD-2-Clause AND BSD-3-Clause AND HPND AND JSON AND MIT AND OFL-1.1 AND Zlib
URL: https://github.com/openvinotoolkit/openvino
# The source tarball contains openvino with all submodules initialized and updated:
# $ git clone --depth 1 --branch 2024.3.0 https://github.com/openvinotoolkit/openvino.git
# $ cd openvino && git submodule update --init --recursive
Source0: %{name}-%{version}.tar.gz
# These packages are only for x86_64 and aarch64
ExclusiveArch: x86_64 aarch64
BuildRequires: cmake
BuildRequires: fdupes
BuildRequires: gcc-c++
BuildRequires: tbb-devel
BuildRequires: pkgconfig
BuildRequires: zstd
BuildRequires: opencl-headers
BuildRequires: pkgconfig(snappy)
BuildRequires: scons
BuildRequires: libatomic
%description
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
#
%package -n %{shlib}
Summary: Shared library for OpenVINO toolkit
%description -n %{shlib}
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
This package provides the shared library for OpenVINO.
#
%package -n lib%{name}-devel
Summary: Headers and sources for OpenVINO toolkit
Requires: %{shlib} = %{version}
Requires: lib%{name}_ir_frontend = %{version}
Requires: lib%{name}_onnx_frontend = %{version}
Requires: lib%{name}_paddle_frontend = %{version}
Requires: lib%{name}_pytorch_frontend = %{version}
Requires: lib%{name}_tensorflow_frontend = %{version}
Requires: lib%{name}_tensorflow_lite_frontend = %{version}
Requires: tbb-devel
Recommends: lib%{name}-auto-batch-plugin = %{version}
Recommends: lib%{name}-auto-plugin = %{version}
Recommends: lib%{name}-hetero-plugin = %{version}
%ifarch x86_64
Recommends: lib%{name}-intel-cpu-plugin = %{version}
Recommends: lib%{name}-intel-gpu-plugin = %{version}
%endif
%ifarch aarch64
Recommends: lib%{name}-arm-cpu-plugin = %{version}
%endif
%description -n lib%{name}-devel
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
This package provides the headers and sources for developing applications with
OpenVINO.
## Plugins ##
#
%package -n lib%{name}-auto-plugin
Summary: Auto / Multi software plugin for OpenVINO toolkit
%description -n lib%{name}-auto-plugin
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
This package provides the Auto / Multi software plugin for OpenVINO.
#
%package -n lib%{name}-auto-batch-plugin
Summary: Automatic batch software plugin for OpenVINO toolkit
%description -n lib%{name}-auto-batch-plugin
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
This package provides the automatic batch software plugin for OpenVINO.
#
%package -n lib%{name}-hetero-plugin
Summary: Hetero execution plugin for OpenVINO toolkit
%description -n lib%{name}-hetero-plugin
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
This package provides the hetero execution plugin for OpenVINO.
%ifarch x86_64
#
%package -n lib%{name}-intel-cpu-plugin
Summary: Intel CPU plugin for OpenVINO toolkit
%description -n lib%{name}-intel-cpu-plugin
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
This package provides the Intel CPU plugin for OpenVINO on x86_64.
#
%package -n lib%{name}-intel-gpu-plugin
Summary: Intel GPU plugin for OpenVINO toolkit
Requires: ocl-icd
%description -n lib%{name}-intel-gpu-plugin
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
This package provides the Intel GPU plugin for OpenVINO on x86_64.
%endif
%ifarch aarch64
#
%package -n lib%{name}-arm-cpu-plugin
Summary: ARM CPU plugin for OpenVINO toolkit
%description -n lib%{name}-arm-cpu-plugin
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
This package provides the Arm CPU plugin for OpenVINO on aarch64.
%endif
## Frontend shared libs ##
#
%package -n lib%{name}_ir_frontend
Summary: IR frontend for OpenVINO toolkit
%description -n lib%{name}_ir_frontend
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
This package provides the IR frontend for OpenVINO.
#
%package -n lib%{name}_onnx_frontend
Summary: ONNX frontend for OpenVINO toolkit
%description -n lib%{name}_onnx_frontend
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
This package provides the ONNX frontend for OpenVINO.
#
%package -n lib%{name}_paddle_frontend
Summary: Paddle frontend for OpenVINO toolkit
%description -n lib%{name}_paddle_frontend
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
This package provides the Paddle frontend for OpenVINO.
#
%package -n lib%{name}_pytorch_frontend
Summary: PyTorch frontend for OpenVINO toolkit
%description -n lib%{name}_pytorch_frontend
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
This package provides the PyTorch frontend for OpenVINO.
#
%package -n lib%{name}_tensorflow_frontend
Summary: TensorFlow frontend for OpenVINO toolkit
%description -n lib%{name}_tensorflow_frontend
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
This package provides the TensorFlow frontend for OpenVINO.
#
%package -n lib%{name}_tensorflow_lite_frontend
Summary: TensorFlow Lite frontend for OpenVINO toolkit
%description -n lib%{name}_tensorflow_lite_frontend
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
This package provides the TensorFlow Lite frontend for OpenVINO.
## Samples/examples ##
#
%package -n %{name}-samples
Summary: Samples for use with OpenVINO toolkit
BuildArch: noarch
%description -n %{name}-samples
OpenVINO is an open-source toolkit for optimizing and deploying AI inference.
This package provides sample applications for use with OpenVINO.
%global debug_package %{nil}
#
%prep
# Download Source0:
# $ wget -qO %{SOURCE0} https://user-repo.openeuler.openatom.cn/lfs-tar/openvino/openvino-2024.3.0.tar.gz
%autosetup -p1
%build
# Otherwise the intel_cpu plugin declares an executable stack
%ifarch x86_64
%define build_ldflags -Wl,-z,noexecstack
%endif
%cmake \
-DCMAKE_CXX_STANDARD=17 \
-DBUILD_SHARED_LIBS=ON \
-DENABLE_OV_ONNX_FRONTEND=ON \
-DENABLE_OV_PADDLE_FRONTEND=ON \
-DENABLE_OV_PYTORCH_FRONTEND=ON \
-DENABLE_OV_IR_FRONTEND=ON \
-DENABLE_OV_TF_FRONTEND=ON \
-DENABLE_OV_TF_LITE_FRONTEND=ON \
-DENABLE_INTEL_NPU=OFF \
-DENABLE_JS=OFF \
-DENABLE_SYSTEM_TBB=ON \
-DCMAKE_BUILD_TYPE=Release \
-DCPACK_GENERATOR=RPM \
%{nil}
%make_build
%install
%make_install DESTDIR=%{buildroot}
# Unnecessary if we get our package dependencies and lib paths right!
rm -fr %{buildroot}%{_prefix}/install_dependencies \
%{buildroot}%{_prefix}/setupvars.sh
%fdupes %{buildroot}%{_datadir}/
# We do not use the bundled third-party libs
rm -rf %{buildroot}%{_datadir}/licenses/*
rm -rf %{buildroot}%{_datadir}/doc
%ldconfig_scriptlets -n %{shlib}
%ldconfig_scriptlets -n lib%{name}_ir_frontend
%ldconfig_scriptlets -n lib%{name}_onnx_frontend
%ldconfig_scriptlets -n lib%{name}_paddle_frontend
%ldconfig_scriptlets -n lib%{name}_pytorch_frontend
%ldconfig_scriptlets -n lib%{name}_tensorflow_lite_frontend
%ldconfig_scriptlets -n lib%{name}_tensorflow_frontend
%files -n %{shlib}
%license LICENSE
%{_libdir}/libopenvino.so.*
%{_libdir}/libopenvino_c.so.*
%files -n lib%{name}-auto-batch-plugin
%dir %{_libdir}/%{name}-%{version}
%{_libdir}/%{name}-%{version}/libopenvino_auto_batch_plugin.so
%files -n lib%{name}-auto-plugin
%dir %{_libdir}/%{name}-%{version}
%{_libdir}/%{name}-%{version}/libopenvino_auto_plugin.so
%ifarch x86_64
%files -n lib%{name}-intel-gpu-plugin
%dir %{_libdir}/%{name}-%{version}
%{_libdir}/%{name}-%{version}/libopenvino_intel_gpu_plugin.so
%{_libdir}/%{name}-%{version}/cache.json
%files -n lib%{name}-intel-cpu-plugin
%dir %{_libdir}/%{name}-%{version}
%{_libdir}/%{name}-%{version}/libopenvino_intel_cpu_plugin.so
%endif
%ifarch aarch64
%files -n lib%{name}-arm-cpu-plugin
%dir %{_libdir}/%{name}-%{version}
%{_libdir}/%{name}-%{version}/libopenvino_arm_cpu_plugin.so
%endif
%files -n lib%{name}-hetero-plugin
%dir %{_libdir}/%{name}-%{version}
%{_libdir}/%{name}-%{version}/libopenvino_hetero_plugin.so
%files -n lib%{name}_onnx_frontend
%{_libdir}/libopenvino_onnx_frontend.so.*
%files -n lib%{name}_ir_frontend
%{_libdir}/libopenvino_ir_frontend.so.*
%files -n lib%{name}_paddle_frontend
%{_libdir}/libopenvino_paddle_frontend.so.*
%files -n lib%{name}_pytorch_frontend
%{_libdir}/libopenvino_pytorch_frontend.so.*
%files -n lib%{name}_tensorflow_frontend
%{_libdir}/libopenvino_tensorflow_frontend.so.*
%files -n lib%{name}_tensorflow_lite_frontend
%{_libdir}/libopenvino_tensorflow_lite_frontend.so.*
%files -n %{name}-samples
%license LICENSE
%{_datadir}/%{name}/
%files -n lib%{name}-devel
%license LICENSE
%{_includedir}/%{name}/
%{_libdir}/cmake/%{name}%{version}/
%{_libdir}/*.so
%{_libdir}/pkgconfig/openvino.pc
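# The devel package ships the pkg-config file above; a minimal sketch of
# consuming it (app.cpp is a hypothetical consumer, not part of this package):
#   $ g++ -std=c++17 app.cpp -o app $(pkg-config --cflags --libs openvino)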
%changelog
* Thu Nov 21 2024 zhouyi <zhouyi198@h-partners.com> - 2024.3.0-2
- Add lfs config
- Edit openvino.spec
* Mon Sep 2 2024 Artyom Anokhov <artyom.anokhov@intel.com> - 2024.3.0-1
- Initial openvino spec file