Original: txtai Tutorial Series
The txtai tutorial series, translated from https://dev.to/neuml/export-and-run-models-with-onnx-fof : 1. Getting to know txtai 2. Build an Embeddings index with Hugging Face datasets 3. Build an embeddings index from a data source 4. Add semantic search to Elasticsearch 5. Extractive QA with txtai 6. Extractive QA with Elasticsearch 7. Apply labels with zero-shot classification 8. The txtai API library 9. Build abstractive text summarization 10. Extract from documents…
2021-12-17 13:20:10
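The core idea behind the embeddings-index tutorials above (index documents as vectors, then rank them by similarity to a query) can be sketched in plain Python. This is an illustration only, not txtai's API: it swaps transformer sentence embeddings for toy bag-of-words vectors, and the names `ToyIndex`, `embed`, and `cosine` are hypothetical.

```python
import math
from collections import Counter

def embed(text):
    # Toy "embedding": bag-of-words term counts.
    # txtai would use transformer sentence embeddings here instead.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class ToyIndex:
    """Index documents, then rank them by similarity to a query."""
    def __init__(self):
        self.docs = []

    def index(self, docs):
        self.docs = [(i, embed(d)) for i, d in enumerate(docs)]

    def search(self, query, limit=1):
        q = embed(query)
        scored = sorted(((cosine(q, v), i) for i, v in self.docs), reverse=True)
        return [(i, s) for s, i in scored[:limit]]

idx = ToyIndex()
idx.index([
    "US tops 5 million confirmed virus cases",
    "Canada's last fully intact ice shelf has suddenly collapsed",
    "Maine man wins $1M from $25 lottery ticket",
])
# Top hit is the lottery document (id 2).
print(idx.search("lottery ticket winner", 1))
```

The split between `index` and `search` mirrors the typical embeddings-index workflow: embed once at index time, embed the query at search time, rank by vector similarity.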
Reposted: Supervisor Process Management
https://blog.csdn.net/lly1122334/article/details/122713267
2023-12-24 20:49:36
Original: 2023-ICLR-Adaptive Budget Allocation for Parameter-Efficient Fine-Tuning
2023-11-01 11:04:58
Reposted: 2023-arxiv-LLaMA-Adapter Efficient Fine-tuning of Language Models with Zero-init Attention
2023-11-01 10:19:08
Reposted: 2022-arxiv-Few-Shot Parameter-Efficient Fine-Tuning is Better and Cheaper than In-Context Learning
2023-10-31 18:01:11
Original: 2021-arxiv-LoRA Low-Rank Adaptation of Large Language Models
2023-10-30 11:10:59
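LoRA's central trick is to freeze the pretrained weight matrix W and learn only a low-rank update B·A (B is d×r, A is r×d, with rank r much smaller than d), scaled by alpha/r. A minimal sketch of that forward pass in plain Python with toy matrices follows; the names W, A, B, and alpha follow the paper's notation, but this is an illustration, not the reference implementation.

```python
def matvec(M, x):
    # Multiply matrix M (list of rows) by vector x.
    return [sum(m * v for m, v in zip(row, x)) for row in M]

def lora_forward(W, A, B, x, alpha=1.0):
    """h = W x + (alpha / r) * B (A x), with W frozen and only A, B trained."""
    r = len(A)                        # rank = number of rows of A
    base = matvec(W, x)               # frozen pretrained path
    update = matvec(B, matvec(A, x))  # low-rank adapter path
    s = alpha / r
    return [b + s * u for b, u in zip(base, update)]

# d = 3, r = 1: the adapter adds only 2*d*r = 6 trainable numbers,
# versus d*d = 9 if we fine-tuned W directly.
W = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # frozen d x d weight
A = [[1, 1, 1]]                        # r x d, trainable
B = [[0.5], [0.0], [0.0]]              # d x r, trainable
print(lora_forward(W, A, B, [1.0, 2.0, 3.0]))  # → [4.0, 2.0, 3.0]
```

Because B(Ax) never materializes a full d×d matrix, the trainable-parameter count scales as 2dr rather than d², which is where the paper's efficiency gains come from.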
Original: 2022-arxiv-P-Tuning v2 Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and
2023-10-30 09:53:20
Original: 2021-arXiv-The Power of Scale for Parameter-Efficient Prompt Tuning
2023-10-25 09:57:01
Original: 2021-arxiv-Prefix-Tuning- Optimizing Continuous Prompts for Generation
2023-10-19 13:18:00
BBBP BACE ClinTox Tox21 ToxCast SIDER HIV PCBA MUV
2022-05-28