
python310-torch-openmpi4-converters-2.5.0-2.1 RPM for noarch

From OpenSuSE Ports Tumbleweed for noarch

Name: python310-torch-openmpi4-converters
Version: 2.5.0
Release: 2.1
Group: Development/Languages/Python
Size: 2128 bytes
Packager: https://bugs.opensuse.org
Url: https://pytorch.org
Distribution: openSUSE:Factory:zSystems
Vendor: openSUSE
Build date: Tue Dec 17 10:32:47 2024
Build host: reproducible
Source RPM: python-torch-openmpi4-2.5.0-2.1.src.rpm
Summary: Converters for onnx and caffe2
Converters from caffe2 to onnx and from onnx to caffe2 formatted files.
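
The package installs the convert-caffe2-to-onnx and convert-onnx-to-caffe2 wrapper scripts listed under Files below. As a minimal, hedged sketch only (it assumes the standalone onnx Python package is available, and "model.onnx" is a hypothetical file name, not something shipped by this package), output from the caffe2-to-onnx direction can be sanity-checked like this:

    import onnx

    # Load the ONNX protobuf written by convert-caffe2-to-onnx
    # ("model.onnx" is a placeholder name, assumed for illustration).
    model = onnx.load("model.onnx")

    # Raises onnx.checker.ValidationError if the graph is structurally invalid.
    onnx.checker.check_model(model)

    # Basic summary of the converted graph.
    print(model.graph.name, len(model.graph.node))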

Provides

Requires

License

Apache-2.0 AND BSD-2-Clause AND BSD-3-Clause AND MIT AND Zlib AND BSL-1.0

Changelog

* Tue Dec 17 2024 Andreas Schwab <schwab@suse.de>
  - Use oneDNN only on x86_64 aarch64 ppc64le
* Fri Oct 18 2024 Guillaume GARDET <guillaume.gardet@opensuse.org>
  - Update to 2.5.0:
    * https://github.com/pytorch/pytorch/releases/tag/v2.5.0
* Fri Oct 04 2024 Guillaume GARDET <guillaume.gardet@opensuse.org>
  - Add patch to fix build with oneDNN:
    * pytorch-patch-onednn.patch
* Tue Oct 01 2024 Guillaume GARDET <guillaume.gardet@opensuse.org>
  - Update to 2.4.1:
    * https://github.com/pytorch/pytorch/releases/tag/v2.4.1
  - Skip update to 2.4.0:
    * https://github.com/pytorch/pytorch/releases/tag/v2.4.0
  - Remove _service since 'osc mr download_files' is easier to use
    and maintain
  - Drop config vars not used anymore: BUILD_CAFFE2, USE_LEVELDB, USE_LMDB,
    USE_OPENCV, USE_TBB
  - Remove examples package since code has been removed upstream
  - Refresh patch:
    * skip-third-party-check.patch
* Thu Aug 29 2024 Guang Yee <gyee@suse.com>
  - Enable sle15_python_module_pythons.
  - GCC 9.3 or newer is required, regardless of whether CUDA is enabled.
    See https://github.com/pytorch/pytorch/blob/v2.3.1/CMakeLists.txt#L48
    Therefore, for SLE15 we went with GCC 11 as it seems to be the most
    common one.
  - Use %gcc_version macro for Tumbleweed.
* Thu Jul 11 2024 Christian Goll <cgoll@suse.com>
  - Update to 2.3.1 with the following summarized highlights:
    * from 2.0.x:
    - torch.compile is the main API for PyTorch 2.0, which wraps your model and
      returns a compiled model. It is a fully additive (and optional) feature
      and hence 2.0 is 100% backward compatible by definition
    - Accelerated Transformers introduce high-performance support for training
      and inference using a custom kernel architecture for scaled dot product
      attention (SDPA). The API is integrated with torch.compile() and model
      developers may also use the scaled dot product attention kernels directly
      by calling the new scaled_dot_product_attention() operator (see the
      usage sketch after the changelog)
    * from 2.1.x:
    - automatic dynamic shape support in torch.compile,
      torch.distributed.checkpoint for saving/loading distributed training jobs
      on multiple ranks in parallel, and torch.compile support for the NumPy
      API.
    - In addition, this release offers numerous performance improvements (e.g.
      CPU inductor improvements, AVX512 support, scaled-dot-product-attention
      support) as well as a prototype release of torch.export, a sound
      full-graph capture mechanism, and torch.export-based quantization.
    * from 2.2.x:
    - 2x performance improvements to scaled_dot_product_attention via
      FlashAttention-v2 integration, as well as AOTInductor, a new
      ahead-of-time compilation and deployment tool built for non-python
      server-side deployments.
    * from 2.3.x:
    - support for user-defined Triton kernels in torch.compile, allowing for
      users to migrate their own Triton kernels from eager without
      experiencing performance complications or graph breaks. As well, Tensor
      Parallelism improves the experience for training Large Language Models
      using native PyTorch functions, which has been validated on training
      runs for 100B parameter models.
  - added separate openmpi4 build
  - added separate vulkan build, although this functionality isn't exposed to
    the python abi
  - For the OBS build, all the vendored sources follow the pattern
    NAME-<7-digit commit>.tar.gz rather than NAME-<full commit>.tar.gz
  - added following patches:
    * skip-third-party-check.patch
    * fix-setup.patch
  - removed patches:
    * pytorch-rm-some-gitmodules.patch
    * fix-call-of-onnxInitGraph.patch
* Thu Jul 22 2021 Guillaume GARDET <guillaume.gardet@opensuse.org>
  - Fix build on x86_64 by using GCC10 instead of GCC11
    https://github.com/google/XNNPACK/issues/1550
* Thu Jul 22 2021 Guillaume GARDET <guillaume.gardet@opensuse.org>
  - Update to 1.9.0
  - Release notes: https://github.com/pytorch/pytorch/releases/tag/v1.9.0
  - Drop upstreamed patch:
    * fix-mov-operand-for-gcc.patch
  - Drop unneeded patches:
    * removed-peachpy-depedency.patch
  - Refresh patches:
    * skip-third-party-check.patch
    * fix-call-of-onnxInitGraph.patch
  - Add new patch:
    * pytorch-rm-some-gitmodules.patch
* Thu Jul 22 2021 Guillaume GARDET <guillaume.gardet@opensuse.org>
  - Add _service file to ease future update of deps
* Thu Jul 22 2021 Guillaume GARDET <guillaume.gardet@opensuse.org>
  - Update sleef to fix build on aarch64
* Fri Apr 23 2021 Matej Cepl <mcepl@suse.com>
  - Don't build python36-* package (missing pandas)
* Thu Jan 21 2021 Benjamin Greiner <code@bnavigator.de>
  - Fix python-rpm-macros usage
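
The 2.3.1 entry above references a usage sketch for the two APIs it describes, torch.compile and scaled_dot_product_attention. The snippet below is a minimal illustration only; the module, tensor shapes, and sizes are assumptions chosen for brevity, not taken from this package:

    import torch
    import torch.nn.functional as F

    # torch.compile wraps an existing model and returns a compiled callable;
    # the eager model itself stays usable, so the feature is purely additive.
    model = torch.nn.Linear(16, 16)
    compiled = torch.compile(model)
    out = compiled(torch.randn(4, 16))

    # scaled_dot_product_attention dispatches to fused SDPA kernels when available.
    # Shapes are (batch, heads, seq_len, head_dim) -- illustrative values.
    q = k = v = torch.randn(2, 8, 32, 64)
    attn = F.scaled_dot_product_attention(q, k, v)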

Files

/etc/alternatives/convert-caffe2-to-onnx
/etc/alternatives/convert-onnx-to-caffe2
/usr/bin/convert-caffe2-to-onnx
/usr/bin/convert-caffe2-to-onnx-3.10
/usr/bin/convert-onnx-to-caffe2
/usr/bin/convert-onnx-to-caffe2-3.10

