Initial commit

README.md
<a href="https://arxiv.org/abs/2406.09414"><img src='https://img.shields.io/badge/arXiv-Depth Anything V2-red' alt='Paper PDF'></a>
<a href='https://depth-anything-v2.github.io'><img src='https://img.shields.io/badge/Project_Page-Depth Anything V2-green' alt='Project Page'></a>
<a href='https://huggingface.co/spaces/depth-anything/Depth-Anything-V2'><img src='https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue'></a>
<a href='https://huggingface.co/datasets/depth-anything/DA-2K'><img src='https://img.shields.io/badge/Benchmark-DA--2K-yellow' alt='Benchmark'></a>
</div>

This work presents Depth Anything V2. It significantly outperforms [V1](https://github.com/LiheYoung/Depth-Anything) in fine-grained details and robustness. Compared with SD-based models, it enjoys faster inference speed, fewer parameters, and higher depth accuracy.

If you find this project useful, please consider citing:

  journal={arXiv:2406.09414},
  year={2024}
}

@inproceedings{depth_anything_v1,
  title={Depth Anything: Unleashing the Power of Large-Scale Unlabeled Data},
  author={Yang, Lihe and Kang, Bingyi and Huang, Zilong and Xu, Xiaogang and Feng, Jiashi and Zhao, Hengshuang},
  booktitle={CVPR},
  year={2024}
}
```