---
title: "Downloading & Uploading functions for package registry"
keywords: fastai
sidebar: home_sidebar
nb_path: "nbs/data.loader.ipynb"
---
{% raw %}
{% endraw %} {% raw %}
{% endraw %} {% raw %}
{% endraw %} {% raw %}

find_git_repo[source]

find_git_repo()

{% endraw %} {% raw %}
{% endraw %} {% raw %}

get_registry_api_key[source]

get_registry_api_key()

{% endraw %} {% raw %}
{% endraw %} {% raw %}

class upload_in_chunks[source]

upload_in_chunks(filename, chunksize=8192)

{% endraw %} {% raw %}

class IterableToFileAdapter[source]

IterableToFileAdapter(iterable)

{% endraw %} {% raw %}
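`upload_in_chunks` and `IterableToFileAdapter` together implement a common streaming-upload pattern: iterate over a file in fixed-size chunks (so progress can be tracked) and wrap that iterable in a minimal file-like object, which an HTTP client such as `requests` can consume as a request body. The sketch below is a re-implementation of that pattern for illustration, not the library's actual source; the real classes may differ in details such as progress reporting.

```python
import os

class upload_in_chunks:
    """Iterate over a file in fixed-size chunks, tracking how much has been read."""
    def __init__(self, filename, chunksize=8192):
        self.filename = filename
        self.chunksize = chunksize
        self.totalsize = os.path.getsize(filename)
        self.readsofar = 0

    def __iter__(self):
        with open(self.filename, "rb") as f:
            while True:
                chunk = f.read(self.chunksize)
                if not chunk:
                    break
                self.readsofar += len(chunk)
                yield chunk

    def __len__(self):
        return self.totalsize


class IterableToFileAdapter:
    """Wrap an iterable of bytes in a minimal file-like object exposing read()."""
    def __init__(self, iterable):
        self.iterator = iter(iterable)
        self.length = len(iterable)
        self.buffer = b""

    def read(self, size=-1):
        # refill the buffer until we can serve `size` bytes (or the iterable ends)
        while size < 0 or len(self.buffer) < size:
            try:
                self.buffer += next(self.iterator)
            except StopIteration:
                break
        if size < 0:
            result, self.buffer = self.buffer, b""
        else:
            result, self.buffer = self.buffer[:size], self.buffer[size:]
        return result

    def __len__(self):
        return self.length


# usage: round-trip a small file through the adapter
import tempfile
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"x" * 10000)
    path = tmp.name

adapter = IterableToFileAdapter(upload_in_chunks(path, chunksize=1024))
data = b""
while True:
    part = adapter.read(4096)
    if not part:
        break
    data += part
os.remove(path)
```

The adapter's `__len__` lets the HTTP client set a `Content-Length` header up front, which many servers require for streamed uploads.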
{% endraw %} {% raw %}

write_file_to_package_registry[source]

write_file_to_package_registry(project_id, file_path, api_key, version='0.0.1')

{% endraw %} {% raw %}
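`write_file_to_package_registry` presumably PUTs the file to GitLab's generic package registry. The endpoint shape below follows GitLab's documented generic-packages upload URL; the host and the package name are assumptions here (the package name `plugin-model-package` matches `download_package_file`'s default).

```python
from pathlib import Path

GITLAB_API = "https://gitlab.memri.io/api/v4"  # assumed host

def registry_upload_url(project_id, file_path,
                        package_name="plugin-model-package", version="0.0.1"):
    """Build the GitLab generic-package upload URL for a local file (sketch)."""
    filename = Path(file_path).name
    return (f"{GITLAB_API}/projects/{project_id}/packages/generic/"
            f"{package_name}/{version}/{filename}")

# The actual upload would then be something along these lines (not executed here):
# requests.put(registry_upload_url(project_id, file_path),
#              headers={"PRIVATE-TOKEN": api_key},
#              data=IterableToFileAdapter(upload_in_chunks(file_path)))
```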
{% endraw %} {% raw %}

project_id_from_name[source]

project_id_from_name(project_name, api_key)

{% endraw %} {% raw %}
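`project_id_from_name` most likely queries GitLab's `GET /projects?search=<name>` endpoint and picks the project whose name matches exactly, since a search can return multiple hits. The selection logic is sketched here over a mocked response; the real function also performs the authenticated API call.

```python
def select_project_id(projects, project_name):
    """Return the id of the project whose 'name' matches exactly, or None."""
    for project in projects:
        if project.get("name") == project_name:
            return project["id"]
    return None

# mocked `GET /projects?search=pymemri` response (structure follows the GitLab API)
mock_response = [
    {"id": 42, "name": "pymemri"},
    {"id": 7, "name": "pymemri-docs"},
]
project_id = select_project_id(mock_response, "pymemri")  # → 42
```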
{% endraw %} {% raw %}

write_huggingface_model_to_package_registry[source]

write_huggingface_model_to_package_registry(project_name, model)

{% endraw %} {% raw %}
{% endraw %} {% raw %}

write_model_to_package_registry[source]

write_model_to_package_registry(model, project_name=None)

{% endraw %} {% raw %}
{% endraw %} {% raw %}

download_package_file[source]

download_package_file(filename, project_name=None, out_dir=None, package_name='plugin-model-package', package_version='0.0.1')

{% endraw %} {% raw %}
{% endraw %} {% raw %}

download_huggingface_model_for_project[source]

download_huggingface_model_for_project(files=None)

{% endraw %} {% raw %}
{% endraw %} {% raw %}
```python
# todo: cleanup old package files during testing
filename = "config.json"
download_package_file(filename)
```

```
downloading config.json from project pymemri, package plugin-model-package
writing config.json to /Users/koen/.memri/projects/pymemri
PosixPath('/Users/koen/.memri/projects/pymemri')
```
{% endraw %}

## Transformers tests

{% raw %}
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from transformers import AutoModel

model = AutoModelForSequenceClassification.from_pretrained("distilroberta-base", num_labels=10)
```

```
Some weights of the model checkpoint at distilroberta-base were not used when initializing RobertaForSequenceClassification: ['roberta.pooler.dense.weight', 'roberta.pooler.dense.bias', 'lm_head.dense.weight', 'lm_head.layer_norm.weight', 'lm_head.layer_norm.bias', 'lm_head.bias', 'lm_head.dense.bias', 'lm_head.decoder.weight']
- This IS expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Some weights of RobertaForSequenceClassification were not initialized from the model checkpoint at distilroberta-base and are newly initialized: ['classifier.dense.weight', 'classifier.out_proj.bias', 'classifier.out_proj.weight', 'classifier.dense.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
```
{% endraw %} {% raw %}
```python
write_model_to_package_registry(model)
```
{% endraw %} {% raw %}
```python
out_dir = download_huggingface_model_for_project()
```
{% endraw %}