InternLM Practical Camp (Season 4) - Entry Island - 4. MaaS Course Tasks

Date: 2024-11-28 07:13:53


Task 1: Model Download

Task Description

Use the Hugging Face platform, the ModelScope community (魔搭, optional), and the Modelers community (魔乐, optional) to download the model mentioned in the course document (at minimum the config.json and model.safetensors.index.json files), and include screenshots of the key steps and results.

Work Log

Download the configuration files of internlm2_5-7b

Create a new file named hf_download_json.py with the following content:

import os
from huggingface_hub import hf_hub_download

# Model repository identifier
repo_id = "internlm/internlm2_5-7b"

# List of files to download
files_to_download = [
    {"filename": "config.json"},
    {"filename": "model.safetensors.index.json"}
]

# Create a local directory to store the downloaded files
local_dir = f"{repo_id.split('/')[1]}"
os.makedirs(local_dir, exist_ok=True)

# Iterate over the file list and download each file
for file_info in files_to_download:
    file_path = hf_hub_download(
        repo_id=repo_id,
        filename=file_info["filename"],
        local_dir=local_dir
    )
    print(f"{file_info['filename']} file downloaded to: {file_path}")

[Screenshot: L0-maas-task4-hf-download]
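As an aside beyond the assignment, huggingface_hub can also fetch just the matching files in a single call via snapshot_download. A minimal sketch, assuming the same repo and the two files required by the task; the allow_patterns list is my own choice:

# Alternative sketch: download only the two required JSON files in one call.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="internlm/internlm2_5-7b",
    allow_patterns=["config.json", "model.safetensors.index.json"],
    local_dir="internlm2_5-7b",
)
print(f"files downloaded to: {local_path}")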

Download internlm2_5-1_8b and print a sample output

Create a file named hf_download_1_8_demo.py with the following content:

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("internlm/internlm2_5-1_8b", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("internlm/internlm2_5-1_8b", torch_dtype=torch.float16, trust_remote_code=True)
model = model.eval()

inputs = tokenizer(["A beautiful flower"], return_tensors="pt")
gen_kwargs = {
    "max_length": 128,
    "top_p": 0.8,
    "temperature": 0.8,
    "do_sample": True,
    "repetition_penalty": 1.0
}

# The generation step below is optional; it may take a while before the model output is printed
output = model.generate(**inputs, **gen_kwargs)
output = tokenizer.decode(output[0].tolist(), skip_special_tokens=True)
print(output)

[Screenshot: L0-maas-task4-hf-demo]
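A side note, not required by the assignment: generation with the fp16 1.8B model is slow on CPU. The sketch below assumes a CUDA GPU is available and moves both the model and the inputs onto it; otherwise it falls back to CPU in float32.

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Sketch: pick device/dtype based on GPU availability (assumption, not part of the original task).
device = "cuda" if torch.cuda.is_available() else "cpu"
dtype = torch.float16 if device == "cuda" else torch.float32

tokenizer = AutoTokenizer.from_pretrained("internlm/internlm2_5-1_8b", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "internlm/internlm2_5-1_8b", torch_dtype=dtype, trust_remote_code=True
).to(device).eval()

# The tokenizer output supports .to(device), so the input tensors follow the model.
inputs = tokenizer(["A beautiful flower"], return_tensors="pt").to(device)
output = model.generate(**inputs, max_length=128, top_p=0.8, temperature=0.8, do_sample=True)
print(tokenizer.decode(output[0].tolist(), skip_special_tokens=True))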

Task 2: Model Upload (Optional)

Task Description

Upload the config.json file we downloaded (and, optionally, other model-related files you add yourself) to the Hugging Face platform and the ModelScope community, and take screenshots.

Work Log

Upload model files to the Hugging Face Hub

Upload via the CLI: the Hugging Face Hub is also backed by Git, and model files of large models are usually big, so we need to install Git LFS for large-file support.

curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | sudo bash
# sudo apt-get install git-lfs # in a Codespace this may hit an apt-key conflict without sufficient permissions
git lfs install # configure Git LFS in the current git environment
pip install huggingface_hub

In the GitHub Codespace:

git config --global credential.helper store
huggingface-cli login

Log in to Hugging Face from the command line and enter your token (created and obtained on the HF website):

@lldhsds ➜ /workspaces/codespaces-jupyter (main) $ git config --global credential.helper store
@lldhsds ➜ /workspaces/codespaces-jupyter (main) $ huggingface-cli login

    _|    _|  _|    _|    _|_|_|    _|_|_|  _|_|_|  _|      _|    _|_|_|      _|_|_|_|    _|_|      _|_|_|  _|_|_|_|
    _|    _|  _|    _|  _|        _|          _|    _|_|    _|  _|            _|        _|    _|  _|        _|
    _|_|_|_|  _|    _|  _|  _|_|  _|  _|_|    _|    _|  _|  _|  _|  _|_|      _|_|_|    _|_|_|_|  _|        _|_|_|
    _|    _|  _|    _|  _|    _|  _|    _|    _|    _|    _|_|  _|    _|      _|        _|    _|  _|        _|
    _|    _|    _|_|      _|_|_|    _|_|_|  _|_|_|  _|      _|    _|_|_|      _|        _|    _|    _|_|_|  _|_|_|_|

    To log in, `huggingface_hub` requires a token generated from https://huggingface.co/settings/tokens .
Enter your token (input will not be visible): 
Add token as git credential? (Y/n) Y
Token is valid (permission: write).
The token `hf_internlm_test` has been saved to /home/codespace/.cache/huggingface/stored_tokens
Your token has been saved in your configured git credential helpers (store).
Your token has been saved to /home/codespace/.cache/huggingface/token
Login successful.
The current active token is: `hf_internlm_test`

Create an HF repository:

# intern_study_L0_4 is the model_name
@lldhsds ➜ /workspaces/codespaces-jupyter (main) $ huggingface-cli repo create intern_study_L0_4
git version 2.47.0
git-lfs/3.5.1 (GitHub; linux amd64; go 1.21.8)

You are about to create lldhsds/intern_study_L0_4
Proceed? [Y/n] Y

Your repo now lives at:
  https://huggingface.co/lldhsds/intern_study_L0_4

You can clone it locally with the command below, and commit/push as usual.

  git clone https://huggingface.co/lldhsds/intern_study_L0_4

# clone the repository created above to the local machine
@lldhsds ➜ /workspaces/codespaces-jupyter (main) $ git clone https://huggingface.co/lldhsds/intern_study_L0_4
Cloning into 'intern_study_L0_4'...
remote: Enumerating objects: 3, done.
remote: Total 3 (delta 0), reused 0 (delta 0), pack-reused 3 (from 1)
Unpacking objects: 100% (3/3), 1.05 KiB | 1.05 MiB/s, done.

Update the repository and push to the remote:

# add a README.md file
@lldhsds ➜ /workspaces/codespaces-jupyter/intern_study_L0_4 (main) $ vim README.md

@lldhsds ➜ /workspaces/codespaces-jupyter/intern_study_L0_4 (main) $ git add .
@lldhsds ➜ /workspaces/codespaces-jupyter/intern_study_L0_4 (main) $ git commit -m "add:intern_study_L0_4"
[main 380529d] add:intern_study_L0_4
 1 file changed, 3 insertions(+)
 create mode 100644 README.md
@lldhsds ➜ /workspaces/codespaces-jupyter/intern_study_L0_4 (main) $ git push
remote: Password authentication in git is no longer supported. You must use a user access token or an SSH key instead. See https://huggingface.co/blog/password-git-deprecation
fatal: Authentication failed for 'https://huggingface.co/lldhsds/intern_study_L0_4/'
@lldhsds ➜ /workspaces/codespaces-jupyter/intern_study_L0_4 (main) $ export hf_token="xxx"	# set your HF access token here

@lldhsds ➜ /workspaces/codespaces-jupyter/intern_study_L0_4 (main) $ git remote set-url origin  https://lldhsds:$hf_token@huggingface.co/lldhsds/intern_study_L0_4
@lldhsds ➜ /workspaces/codespaces-jupyter/intern_study_L0_4 (main) $ git push
Enumerating objects: 4, done.
Counting objects: 100% (4/4), done.
Delta compression using up to 4 threads
Compressing objects: 100% (3/3), done.
Writing objects: 100% (3/3), 449 bytes | 449.00 KiB/s, done.
Total 3 (delta 0), reused 0 (delta 0), pack-reused 0 (from 0)
remote: -------------------------------------------------------------------------
remote: Your push was accepted, but with warnings: 
remote: - Warning: empty or missing yaml metadata in repo card
remote: help: https://huggingface.co/docs/hub/model-cards#model-card-metadata
remote: -------------------------------------------------------------------------
remote: -------------------------------------------------------------------------
remote: Please find the documentation at:
remote: https://huggingface.co/docs/hub/model-cards#model-card-metadata
remote: 
remote: -------------------------------------------------------------------------
To https://huggingface.co/lldhsds/intern_study_L0_4
   510dcc0..380529d  main -> main

Repository information:

[Screenshot: L0-maas-task4-hf-repo]
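Besides the git-based workflow above, the huggingface_hub Python API can upload individual files without touching git at all. A minimal sketch, assuming a write-scoped token exported as HF_TOKEN (my own naming) and the repository created above:

import os
from huggingface_hub import HfApi

# Assumption: HF_TOKEN holds a write-scoped token and the repo already exists.
api = HfApi(token=os.environ["HF_TOKEN"])
api.upload_file(
    path_or_fileobj="config.json",      # local file downloaded in Task 1
    path_in_repo="config.json",         # destination path inside the repo
    repo_id="lldhsds/intern_study_L0_4",
    repo_type="model",
)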

Upload model files to the ModelScope community

1. Register and log in to ModelScope, then create a model.

[Screenshot: L0-maas-modelscope-model]

2. Download (clone) the model repository and modify it locally:

git clone https://www.modelscope.cn/lldhsds/intern_study_L0_4.git

3. Upload the model files:

(/root/share/pre_envs/pytorch2.1.2cu12.1) root@intern-studio-50014188:~/intern_study_L0_4# ls
README.md  configuration.json

# create config.json
(/root/share/pre_envs/pytorch2.1.2cu12.1) root@intern-studio-50014188:~/intern_study_L0_4# vim config.json

# the standard git add / commit / push sequence
(/root/share/pre_envs/pytorch2.1.2cu12.1) root@intern-studio-50014188:~/intern_study_L0_4# git add .
(/root/share/pre_envs/pytorch2.1.2cu12.1) root@intern-studio-50014188:~/intern_study_L0_4# git commit -m "add:intern_study_L0_4"
[master ac42332] add:intern_study_L0_4
 1 file changed, 35 insertions(+)
 create mode 100644 config.json

# this step requires authentication: ModelScope username + ModelScope git token
(/root/share/pre_envs/pytorch2.1.2cu12.1) root@intern-studio-50014188:~/intern_study_L0_4# git push
Enumerating objects: 4, done.
Counting objects: 100% (4/4), done.
Delta compression using up to 128 threads
Compressing objects: 100% (3/3), done.
Writing objects: 100% (3/3), 692 bytes | 57.00 KiB/s, done.
Total 3 (delta 1), reused 0 (delta 0)
To https://www.modelscope.cn/lldhsds/intern_study_L0_4.git
   5acaa46..ac42332  master -> master

4. View the repository.

Uploaded files need to be reviewed by ModelScope.

[Screenshot: L0-maas-modelscope-model-up]
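ModelScope also offers a Python upload path that avoids typing git credentials. A sketch under the assumption that the modelscope SDK's HubApi.push_model behaves as described in its model-upload docs and that an access token from the ModelScope account page is available; the token string and local path below are placeholders:

from modelscope.hub.api import HubApi

# Assumption: an access token obtained from ModelScope account settings.
api = HubApi()
api.login("YOUR_MODELSCOPE_ACCESS_TOKEN")

# Push the local directory (which must contain configuration.json) to the repo created above.
api.push_model(
    model_id="lldhsds/intern_study_L0_4",
    model_dir="/root/intern_study_L0_4",
)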

Task 3: Space Upload (Optional)

Task Description

Use Spaces on the Hugging Face platform and successfully deploy intern_cobuild; take screenshots of the key steps.

Work Log

Visit https://huggingface.co/spaces and click "Create new Space" in the upper-right corner to create a project. Name it intern_cobuild and choose the Static SDK. After creation, it automatically redirects to a default HTML page.

After the Space is created, go back to the GitHub Codespace, then clone and modify the project:

@lldhsds ➜ /workspaces/codespaces-jupyter (main) $ git clone https://huggingface.co/spaces/lldhsds/intern_cobuild
Cloning into 'intern_cobuild'...
remote: Enumerating objects: 6, done.
remote: Total 6 (delta 0), reused 0 (delta 0), pack-reused 6 (from 1)
Unpacking objects: 100% (6/6), 1.88 KiB | 1.88 MiB/s, done.
@lldhsds ➜ /workspaces/codespaces-jupyter (main) $ cd intern_cobuild/
@lldhsds ➜ /workspaces/codespaces-jupyter/intern_cobuild (main) $ ls
README.md  index.html  style.css
@lldhsds ➜ /workspaces/codespaces-jupyter/intern_cobuild (main) $ mv index.html index.html.bak
@lldhsds ➜ /workspaces/codespaces-jupyter/intern_cobuild (main) $ vim index.html 

Modify the content of index.html as follows:

<!doctype html>
<html>
<head>
  <meta charset="utf-8" />
  <meta name="viewport" content="width=device-width" />
  <title>My static Space</title>
  <style>
    html, body {
      margin: 0;
      padding: 0;
      height: 100%;
    }
    body {
      display: flex;
      justify-content: center;
      align-items: center;
    }
    iframe {
      width: 430px;
      height: 932px;
      border: none;
    }
  </style>
</head>
<body>
  <iframe src="https://colearn.intern-ai.org.cn/cobuild" title="description"></iframe>
</body>
</html>

Push the changes:

@lldhsds ➜ /workspaces/codespaces-jupyter/intern_cobuild (main) $ export hf_token="xxx" # replace with your actual HF access token
@lldhsds ➜ /workspaces/codespaces-jupyter/intern_cobuild (main) $ git remote set-url origin https://lldhsds:$hf_token@huggingface.co/spaces/lldhsds/intern_cobuild/
@lldhsds ➜ /workspaces/codespaces-jupyter/intern_cobuild (main) $ git push
Enumerating objects: 5, done.
Counting objects: 100% (5/5), done.
Delta compression using up to 4 threads
Compressing objects: 100% (3/3), done.
Writing objects: 100% (3/3), 596 bytes | 596.00 KiB/s, done.
Total 3 (delta 1), reused 0 (delta 0), pack-reused 0 (from 0)
To https://huggingface.co/spaces/lldhsds/intern_cobuild/
   fb177af..3b8fef2  main -> main

After the push succeeds, check the Space page; it has been updated:

[Screenshot: L0-maas-task4-hf-space]
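As an aside, the same huggingface_hub API used for the model repo can also push files to a Space (repo_type="space"), which avoids embedding the token in the git remote URL. A minimal sketch, assuming the write-scoped token is exported as HF_TOKEN:

import os
from huggingface_hub import HfApi

# Assumption: HF_TOKEN holds a write-scoped token; upload the edited page to the Space.
api = HfApi(token=os.environ["HF_TOKEN"])
api.upload_file(
    path_or_fileobj="index.html",
    path_in_repo="index.html",
    repo_id="lldhsds/intern_cobuild",
    repo_type="space",
)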