

Add openai-whisper #26360

Open

wants to merge 10 commits into main
Conversation

@hoxbro (Contributor) commented May 15, 2024

Checklist

  • Title of this PR is meaningful: e.g. "Adding my_nifty_package", not "updated meta.yaml".
  • License file is packaged (see here for an example).
  • Source is from official source.
  • Package does not vendor other packages. (If a package uses the source of another package, they should be separate packages, or the licenses of all bundled packages must be shipped.)
  • If static libraries are linked in, the license of the static library is packaged.
  • Package does not ship static libraries. If static libraries are needed, follow CFEP-18.
  • Build number is 0.
  • A tarball (url) rather than a repo (e.g. git_url) is used in your recipe (see here for more details).
  • GitHub users listed in the maintainer section have posted a comment confirming they are willing to be listed there.
  • When in trouble, please check our knowledge base documentation before pinging a team.

@conda-forge-webservices

Hi! This is the friendly automated conda-forge-linting service.

I wanted to let you know that I linted all conda-recipes in your PR (recipes/openai-whisper) and found some lint.

Here's what I've got...

For recipes/openai-whisper:

  • noarch packages can't have selectors. If the selectors are necessary, please remove noarch: python.
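To illustrate the linter's point with a hypothetical fragment (not the PR's actual recipe): line selectors such as `# [win]` are incompatible with `noarch: python`, so one or the other has to go.

```yaml
# Hypothetical meta.yaml fragments — selectors and noarch cannot mix.

# Not allowed: noarch combined with a platform selector
build:
  noarch: python
  skip: true  # [win]

# Option 1: keep noarch and drop every selector
build:
  noarch: python

# Option 2: keep the selector and drop noarch (build per-platform instead)
build:
  skip: true  # [win]
```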

@conda-forge-webservices

Hi! This is the friendly automated conda-forge-linting service.

I wanted to let you know that I linted all conda-recipes in your PR (recipes/openai-whisper) and found some lint.

Here's what I've got...

For recipes/openai-whisper:

  • Non noarch packages should have python requirement without any version constraints.
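As a sketch of what the linter wants (a hypothetical fragment): for a non-noarch package, `python` appears in both `host` and `run` without a version pin, and conda-build fans the build out across the configured Python versions instead.

```yaml
requirements:
  host:
    - python   # no version constraint; the build matrix supplies the versions
    - pip
  run:
    - python   # likewise unpinned at run time
```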

@conda-forge-webservices

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipes/openai-whisper) and found it was in an excellent condition.

@conda-forge-webservices

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipes/openai-whisper) and found it was in an excellent condition.

I do have some suggestions for making it better though...

For recipes/openai-whisper:

  • Please depend on pytorch directly, in order to avoid forcing CUDA users to downgrade to the CPU version for no reason.
  • Please depend on pytorch directly. If your package definitely requires the CUDA version, please depend on pytorch =*=cuda*.
  • In your conda_build_config.yaml, please change the name of MACOSX_DEPLOYMENT_TARGET to c_stdlib_version!
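A sketch of both suggestions combined (assumed fragments and an assumed deployment-target value, not the PR's final files): depend on `pytorch` with no CPU/CUDA variant pin in meta.yaml, and rename the key in conda_build_config.yaml.

```yaml
# meta.yaml (fragment): plain pytorch dependency, so the solver can pick
# the CPU or CUDA build that matches the user's environment
requirements:
  run:
    - pytorch

# conda_build_config.yaml (fragment): c_stdlib_version replaces
# MACOSX_DEPLOYMENT_TARGET; "10.13" here is an illustrative value
c_stdlib_version:  # [osx and x86_64]
  - "10.13"        # [osx and x86_64]
```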

@conda-forge-webservices

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipes/openai-whisper) and found it was in an excellent condition.

I do have some suggestions for making it better though...

For recipes/openai-whisper:

  • In your conda_build_config.yaml, please change the name of MACOSX_DEPLOYMENT_TARGET to c_stdlib_version!

@hoxbro hoxbro closed this May 20, 2024
@hoxbro hoxbro reopened this May 20, 2024
@hoxbro hoxbro closed this May 20, 2024
@hoxbro hoxbro reopened this May 20, 2024
@conda-forge-webservices

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipes/openai-whisper) and found it was in an excellent condition.

- tiktoken
- ffmpeg
run_constrained:
- triton >=2.0.0,<3
@hoxbro (Contributor, Author) commented on the diff:

I don't know the best way to add this. Currently, it looks like CUDA is the only supported option.

openai/whisper@main/requirements.txt#L7
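One possible way to express this (my assumption, not something the reviewers settled on): since the upstream requirements file only installs triton on Linux x86_64, the constraint could be scoped with a selector.

```yaml
run_constrained:
  - triton >=2.0.0,<3  # [linux and x86_64]
```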

- whisper=whisper.transcribe:cli
script: {{ PYTHON }} -m pip install . -vv --no-deps
number: 0
skip: true # [win]
@hoxbro (Contributor, Author) commented on the diff:

I could not get this to build on Windows because of pytorch.
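For context, the diff lines shown above would sit in a build section roughly like this (a reconstruction from the visible hunks, not the exact recipe):

```yaml
build:
  number: 0
  skip: true  # [win] — pytorch could not be resolved on Windows
  script: {{ PYTHON }} -m pip install . -vv --no-deps
  entry_points:
    - whisper=whisper.transcribe:cli
```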

@hoxbro marked this pull request as ready for review May 26, 2024 12:31
@hoxbro (Contributor, Author) commented May 26, 2024

@conda-forge/help-python, this should be ready for review; I have left some comments with parts I'm unsure about.
