Commit 0801d27

Remove PyTorch 2.3 support for Intel GPU (#13097)

* Remove PyTorch 2.3 installation option for GPU
* Remove xpu_lnl option in installation guides for docs
* Update BMG quickstart
* Remove PyTorch 2.3 dependencies for GPU examples
* Update the graphmode example to use stable version 2.2.0
* Fix based on comments
1 parent a2a35fd commit 0801d27

File tree

9 files changed: +92 −291 lines changed

docs/mddocs/Overview/install_gpu.md (+28 −74)

````diff
@@ -46,93 +46,47 @@ We recommend using [Miniforge](https://conda-forge.org/download/) to create a py
 
 The easiest ways to install `ipex-llm` is the following commands.
 
-- For **Intel Core™ Ultra Processors (Series 2) with processor number 2xxV (code name Lunar Lake)**:
+Choose either US or CN website for `extra-index-url`:
 
-  Choose either US or CN website for `extra-index-url`:
+- For **US**:
 
-  - For **US**:
-
-    ```cmd
-    conda create -n llm python=3.11 libuv
-    conda activate llm
-
-    pip install --pre --upgrade ipex-llm[xpu_lnl] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/lnl/us/
-    ```
-
-  - For **CN**:
-
-    ```cmd
-    conda create -n llm python=3.11 libuv
-    conda activate llm
-
-    pip install --pre --upgrade ipex-llm[xpu_lnl] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/lnl/cn/
-    ```
-
-- For **other Intel iGPU and dGPU**:
-
-  Choose either US or CN website for `extra-index-url`:
-
-  - For **US**:
-
-    ```cmd
-    conda create -n llm python=3.11 libuv
-    conda activate llm
+  ```cmd
+  conda create -n llm python=3.11 libuv
+  conda activate llm
 
-    pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-    ```
+  pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+  ```
 
-  - For **CN**:
+- For **CN**:
 
-    ```cmd
-    conda create -n llm python=3.11 libuv
-    conda activate llm
+  ```cmd
+  conda create -n llm python=3.11 libuv
+  conda activate llm
 
-    pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
-    ```
+  pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
+  ```
 
 #### Install IPEX-LLM From Wheel
 
 If you encounter network issues when installing IPEX, you can also install IPEX-LLM dependencies for Intel XPU from source archives. First you need to download and install torch/torchvision/ipex from wheels listed below before installing `ipex-llm`.
 
-- For **Intel Core™ Ultra Processors (Series 2) with processor number 2xxV (code name Lunar Lake)**:
-
-  Download the wheels on Windows system:
-
-  ```
-  wget https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/lnl/torch-2.3.1%2Bcxx11.abi-cp311-cp311-win_amd64.whl
-  wget https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/lnl/torchvision-0.18.1%2Bcxx11.abi-cp311-cp311-win_amd64.whl
-  wget https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/lnl/intel_extension_for_pytorch-2.3.110%2Bxpu-cp311-cp311-win_amd64.whl
-  ```
-
-  You may install dependencies directly from the wheel archives and then install `ipex-llm` using following commands:
-
-  ```
-  pip install torch-2.3.1+cxx11.abi-cp311-cp311-win_amd64.whl
-  pip install torchvision-0.18.1+cxx11.abi-cp311-cp311-win_amd64.whl
-  pip install intel_extension_for_pytorch-2.3.110+xpu-cp311-cp311-win_amd64.whl
-
-  pip install --pre --upgrade ipex-llm[xpu_lnl]
-  ```
+Download the wheels on Windows system:
 
-- For **other Intel iGPU and dGPU**:
-
-  Download the wheels on Windows system:
-
-  ```
-  wget https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/torch-2.1.0a0%2Bcxx11.abi-cp311-cp311-win_amd64.whl
-  wget https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/torchvision-0.16.0a0%2Bcxx11.abi-cp311-cp311-win_amd64.whl
-  wget https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/intel_extension_for_pytorch-2.1.10%2Bxpu-cp311-cp311-win_amd64.whl
-  ```
+```
+wget https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/torch-2.1.0a0%2Bcxx11.abi-cp311-cp311-win_amd64.whl
+wget https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/torchvision-0.16.0a0%2Bcxx11.abi-cp311-cp311-win_amd64.whl
+wget https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/intel_extension_for_pytorch-2.1.10%2Bxpu-cp311-cp311-win_amd64.whl
+```
 
-  You may install dependencies directly from the wheel archives and then install `ipex-llm` using following commands:
+You may install dependencies directly from the wheel archives and then install `ipex-llm` using following commands:
 
-  ```
-  pip install torch-2.1.0a0+cxx11.abi-cp311-cp311-win_amd64.whl
-  pip install torchvision-0.16.0a0+cxx11.abi-cp311-cp311-win_amd64.whl
-  pip install intel_extension_for_pytorch-2.1.10+xpu-cp311-cp311-win_amd64.whl
+```
+pip install torch-2.1.0a0+cxx11.abi-cp311-cp311-win_amd64.whl
+pip install torchvision-0.16.0a0+cxx11.abi-cp311-cp311-win_amd64.whl
+pip install intel_extension_for_pytorch-2.1.10+xpu-cp311-cp311-win_amd64.whl
 
-  pip install --pre --upgrade ipex-llm[xpu]
-  ```
+pip install --pre --upgrade ipex-llm[xpu]
+```
 
 > [!NOTE]
 > All the wheel packages mentioned here are for Python 3.11. If you would like to use Python 3.9 or 3.10, you should modify the wheel names for ``torch``, ``torchvision``, and ``intel_extension_for_pytorch`` by replacing ``cp311`` with ``cp39`` or ``cp310``, respectively.
````
````diff
@@ -453,7 +407,7 @@ We recommend using [Miniforge](https://conda-forge.org/download/) to create a py
   > The ``xpu`` option will install IPEX-LLM with PyTorch 2.1 by default, which is equivalent to
   >
   > ```bash
-  > pip install --pre --upgrade ipex-llm[xpu_2.1] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/> xpu/us/
+  > pip install --pre --upgrade ipex-llm[xpu_2.1] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
   > ```
 
 - For **CN**:
@@ -470,7 +424,7 @@ We recommend using [Miniforge](https://conda-forge.org/download/) to create a py
   > The ``xpu`` option will install IPEX-LLM with PyTorch 2.1 by default, which is equivalent to
   >
   > ```bash
-  > pip install --pre --upgrade ipex-llm[xpu_2.1] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/> xpu/cn/
+  > pip install --pre --upgrade ipex-llm[xpu_2.1] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
   > ```
 
 - For **PyTorch 2.0** (deprecated for versions ``ipex-llm >= 2.1.0b20240511``):
````
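The wheel-renaming note in the hunk above (replacing ``cp311`` with ``cp39`` or ``cp310``) amounts to a simple tag substitution in the filename. A minimal sketch of that renaming; the `retag_wheel` helper is hypothetical and not part of ipex-llm, and the wheel name is taken from the install-from-wheel commands in the guide:

```python
# Hypothetical helper illustrating the note above: swap the cp311
# interpreter/ABI tags in a wheel filename for another CPython version.
def retag_wheel(name: str, py_tag: str) -> str:
    """Replace every cp311 tag in a wheel filename with `py_tag`."""
    return name.replace("cp311", py_tag)

# Wheel name copied from the guide's install-from-wheel section.
wheel = "torch-2.1.0a0+cxx11.abi-cp311-cp311-win_amd64.whl"
print(retag_wheel(wheel, "cp310"))
# torch-2.1.0a0+cxx11.abi-cp310-cp310-win_amd64.whl
```

Note that this renames only the filename you pass to `pip install`; the corresponding wheel for that Python version must actually exist on the download server.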

docs/mddocs/Quickstart/bmg_quickstart.md (+24 −39)

````diff
@@ -67,27 +67,18 @@ conda activate llm
 
 With the `llm` environment active, install the appropriate `ipex-llm` package based on your use case:
 
 #### For PyTorch and HuggingFace:
-Install the `ipex-llm[xpu-arc]` package. Choose either the US or CN website for `extra-index-url`:
+Install the `ipex-llm[xpu_2.6]` package:
 
-- For **US**:
-  ```bash
-  pip install --pre --upgrade ipex-llm[xpu-arc] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-  ```
-
-- For **CN**:
-  ```bash
-  pip install --pre --upgrade ipex-llm[xpu-arc] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
-  ```
+```bash
+pip install --pre --upgrade ipex-llm[xpu_2.6] --extra-index-url https://download.pytorch.org/whl/xpu
+```
 
 #### For llama.cpp and Ollama:
-Install the `ipex-llm[cpp]` package.
-
-```bash
-pip install --pre --upgrade ipex-llm[cpp]
-```
+Install the `ipex-llm[cpp]` package:
 
-> [!NOTE]
-> If you encounter network issues during installation, refer to the [troubleshooting guide](../Overview/install_gpu.md#install-ipex-llm-from-wheel-1) for alternative steps.
+```bash
+pip install --pre --upgrade ipex-llm[cpp]
+```
 
 ---
 
@@ -106,7 +97,7 @@ If your driver version is lower than `32.0.101.6449/32.0.101.101.6256`, update i
 Download and install Miniforge for Windows from the [official page](https://conda-forge.org/download/). After installation, create and activate a Python environment:
 
 ```cmd
-conda create -n llm python=3.11 libuv
+conda create -n llm python=3.11
 conda activate llm
 ```
 
@@ -117,27 +108,18 @@ conda activate llm
 
 With the `llm` environment active, install the appropriate `ipex-llm` package based on your use case:
 
 #### For PyTorch and HuggingFace:
-Install the `ipex-llm[xpu-arc]` package. Choose either the US or CN website for `extra-index-url`:
+Install the `ipex-llm[xpu_2.6]` package:
 
-- For **US**:
-  ```cmd
-  pip install --pre --upgrade ipex-llm[xpu-arc] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-  ```
-
-- For **CN**:
-  ```cmd
-  pip install --pre --upgrade ipex-llm[xpu-arc] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
-  ```
+```bash
+pip install --pre --upgrade ipex-llm[xpu_2.6] --extra-index-url https://download.pytorch.org/whl/xpu
+```
 
 #### For llama.cpp and Ollama:
-Install the `ipex-llm[cpp]` package.
-
-```cmd
-pip install --pre --upgrade ipex-llm[cpp]
-```
+Install the `ipex-llm[cpp]` package:
 
-> [!NOTE]
-> If you encounter network issues while installing IPEX, refer to [this guide](../Overview/install_gpu.md#install-ipex-llm-from-wheel) for troubleshooting advice.
+```cmd
+pip install --pre --upgrade ipex-llm[cpp]
+```
 
 ---
 
@@ -166,21 +148,24 @@ Run a Quick PyTorch Example:
 torch.Size([1, 1, 40, 40])
 ```
 
-For benchmarks and performance measurement, refer to the [Benchmark Quickstart guide](https://github.com/intel-analytics/ipex-llm/blob/main/docs/mddocs/Quickstart/benchmark_quickstart.md).
+> [!TIP]
+> Please refer to here ([Linux](./install_pytorch26_gpu.md#runtime-configurations-1) or [Windows](./install_pytorch26_gpu.md#runtime-configurations)) regarding runtime configurations for PyTorch with IPEX-LLM on B-Series GPU.
+
+For benchmarks and performance measurement, refer to the [Benchmark Quickstart guide](./benchmark_quickstart.md).
 
 ---
 
 ### 3.2 Ollama
 
-To integrate and run with **Ollama**, follow the [Ollama Quickstart guide](https://github.com/intel-analytics/ipex-llm/blob/main/docs/mddocs/Quickstart/ollama_quickstart.md).
+To integrate and run with **Ollama**, follow the [Ollama Quickstart guide](./ollama_quickstart.md).
 
 ### 3.3 llama.cpp
 
-For instructions on how to run **llama.cpp** with IPEX-LLM, refer to the [llama.cpp Quickstart guide](https://github.com/intel-analytics/ipex-llm/blob/main/docs/mddocs/Quickstart/llama_cpp_quickstart.md).
+For instructions on how to run **llama.cpp** with IPEX-LLM, refer to the [llama.cpp Quickstart guide](./llama_cpp_quickstart.md).
 
 ### 3.4 vLLM
 
-To set up and run **vLLM**, follow the [vLLM Quickstart guide](https://github.com/intel-analytics/ipex-llm/blob/main/docs/mddocs/Quickstart/vLLM_quickstart.md).
+To set up and run **vLLM**, follow the [vLLM Quickstart guide](./vLLM_quickstart.md).
 
 ## 4. Troubleshooting
 
````
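After this commit, the BMG quickstart branches only on use case (PyTorch/HuggingFace vs llama.cpp/Ollama), no longer on region. That selection can be sketched as follows; the `COMMANDS` table and `pip_command` helper are hypothetical, with the command strings copied from the diff above:

```python
# Hypothetical helper mapping the BMG quickstart's two use cases to the
# pip commands shown in the updated guide above.
COMMANDS = {
    "pytorch": ("pip install --pre --upgrade ipex-llm[xpu_2.6] "
                "--extra-index-url https://download.pytorch.org/whl/xpu"),
    "cpp": "pip install --pre --upgrade ipex-llm[cpp]",
}

def pip_command(use_case: str) -> str:
    """Return the install command for 'pytorch' (HuggingFace) or 'cpp' (llama.cpp/Ollama)."""
    try:
        return COMMANDS[use_case]
    except KeyError:
        raise ValueError(f"unknown use case: {use_case!r}") from None

print(pip_command("cpp"))
# pip install --pre --upgrade ipex-llm[cpp]
```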

docs/mddocs/Quickstart/install_windows_gpu.md (+13 −37)

````diff
@@ -59,49 +59,25 @@ conda activate llm
 
 With the `llm` environment active, use `pip` to install `ipex-llm` for GPU:
 
-- For **Intel Core™ Ultra Processors (Series 2) with processor number 2xxV (code name Lunar Lake)**:
+Choose either US or CN website for `extra-index-url`:
 
-  Choose either US or CN website for `extra-index-url`:
+- For **US**:
 
-  - For **US**:
-
-    ```cmd
-    conda create -n llm python=3.11 libuv
-    conda activate llm
-
-    pip install --pre --upgrade ipex-llm[xpu_lnl] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/lnl/us/
-    ```
-
-  - For **CN**:
-
-    ```cmd
-    conda create -n llm python=3.11 libuv
-    conda activate llm
-
-    pip install --pre --upgrade ipex-llm[xpu_lnl] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/lnl/cn/
-    ```
-
-- For **other Intel iGPU and dGPU**:
-
-  Choose either US or CN website for `extra-index-url`:
-
-  - For **US**:
-
-    ```cmd
-    conda create -n llm python=3.11 libuv
-    conda activate llm
+  ```cmd
+  conda create -n llm python=3.11 libuv
+  conda activate llm
 
-    pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-    ```
+  pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+  ```
 
-  - For **CN**:
+- For **CN**:
 
-    ```cmd
-    conda create -n llm python=3.11 libuv
-    conda activate llm
+  ```cmd
+  conda create -n llm python=3.11 libuv
+  conda activate llm
 
-    pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
-    ```
+  pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
+  ```
 
 > [!NOTE]
 > If you encounter network issues while installing IPEX, refer to [this guide](../Overview/install_gpu.md#install-ipex-llm-from-wheel) for troubleshooting advice.
````
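As the diff above shows, the US and CN variants differ only in the `extra-index-url`; the rest of the command is identical. A minimal sketch of that choice, assuming a hypothetical `install_command` helper (the two index URLs are the ones given in the guide):

```python
# Hypothetical helper: build the ipex-llm[xpu] install command for a region.
# The two index URLs are copied from the installation guide above.
INDEX_URLS = {
    "us": "https://pytorch-extension.intel.com/release-whl/stable/xpu/us/",
    "cn": "https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/",
}

def install_command(region: str) -> str:
    """Return the pip command for region 'us' or 'cn' (case-insensitive)."""
    url = INDEX_URLS[region.lower()]
    return f"pip install --pre --upgrade ipex-llm[xpu] --extra-index-url {url}"

print(install_command("US"))
```

Which mirror is faster depends on where you are; both serve the same wheels.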

docs/mddocs/Quickstart/install_windows_gpu.zh-CN.md (+13 −34)

````diff
@@ -60,47 +60,26 @@ conda activate llm
 ## Install `ipex-llm`
 
 With the `llm` environment active, use `pip` to install `ipex-llm` for GPU:
-- For **Intel Core™ Ultra Processors (Series 2) with processor number 2xxV (code name Lunar Lake)**:
 
-  Choose either the US or CN `extra-index-url` depending on your region:
+Choose either the US or CN `extra-index-url` depending on your region:
 
-  - **US**:
+- **US**:
 
-    ```cmd
-    conda create -n llm python=3.11 libuv
-    conda activate llm
-
-    pip install --pre --upgrade ipex-llm[xpu_lnl] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/lnl/us/
-    ```
-  - **CN**:
-
-    ```cmd
-    conda create -n llm python=3.11 libuv
-    conda activate llm
-
-    pip install --pre --upgrade ipex-llm[xpu_lnl] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/lnl/cn/
-    ```
-- For **other Intel iGPU and dGPU**:
-
-  Choose either the US or CN `extra-index-url` depending on your region:
-
-  - **US**:
-
-    ```cmd
-    conda create -n llm python=3.11 libuv
-    conda activate llm
+  ```cmd
+  conda create -n llm python=3.11 libuv
+  conda activate llm
 
-    pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
-    ```
+  pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/us/
+  ```
 
-  - **CN**:
+- **CN**:
 
-    ```cmd
-    conda create -n llm python=3.11 libuv
-    conda activate llm
+  ```cmd
+  conda create -n llm python=3.11 libuv
+  conda activate llm
 
-    pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
-    ```
+  pip install --pre --upgrade ipex-llm[xpu] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
+  ```
 
 > [!NOTE]
 > If you encounter network issues while installing IPEX, refer to [this guide](../Overview/install_gpu.md#install-ipex-llm-from-wheel) for troubleshooting advice.
````

python/llm/example/GPU/GraphMode/README.md (+1 −1)

````diff
@@ -6,7 +6,7 @@ Here, we provide how to run [torch graph mode](https://pytorch.org/blog/optimizi
 ```bash
 conda create -n ipex-llm python=3.11
 conda activate ipex-llm
-pip install --pre --upgrade ipex-llm[xpu_arc] --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
+pip install --pre --upgrade ipex-llm[xpu_arc]==2.2.0 --extra-index-url https://pytorch-extension.intel.com/release-whl/stable/xpu/cn/
 pip install --pre pytorch-triton-xpu==3.0.0+1b2f15840e --index-url https://download.pytorch.org/whl/nightly/xpu
 conda install -c conda-forge libstdcxx-ng
 unset OCL_ICD_VENDORS
````
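The updated README pins `ipex-llm` to stable version 2.2.0 and `pytorch-triton-xpu` to a specific build. A hedged sketch of verifying such pins after installation; `matches_pin` and `check_pins` are hypothetical helpers, and the check is a plain prefix comparison rather than full PEP 440 version handling:

```python
# Hypothetical helpers: confirm the packages pinned above resolved to the
# expected versions in the current environment.
from importlib import metadata

# Pins copied from the install commands in the README diff above.
PINS = {
    "ipex-llm": "2.2.0",
    "pytorch-triton-xpu": "3.0.0+1b2f15840e",
}

def matches_pin(installed: str, pin: str) -> bool:
    """Prefix comparison only -- an assumption, not full PEP 440 matching."""
    return installed.startswith(pin)

def check_pins(pins):
    """Map each package to True/False, or None when it is not installed."""
    result = {}
    for pkg, pin in pins.items():
        try:
            result[pkg] = matches_pin(metadata.version(pkg), pin)
        except metadata.PackageNotFoundError:
            result[pkg] = None  # package absent in this environment
    return result
```

Running `check_pins(PINS)` after the install commands above gives a quick sanity check that the pinned versions, not newer pre-releases, were actually installed.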
