
Feat/lazy init #1539

Merged
merged 20 commits into from
Apr 2, 2024
Conversation

nathanielsimard
Member

Fix #1329

Cargo.toml Outdated Show resolved Hide resolved
@nathanielsimard nathanielsimard marked this pull request as draft March 27, 2024 19:56

codecov bot commented Mar 28, 2024

Codecov Report

Attention: Patch coverage is 88.86894%, with 62 lines in your changes missing coverage. Please review.

Project coverage is 86.48%. Comparing base (c4eac86) to head (db0f3f1).

Files Patch % Lines
crates/burn-core/src/module/param/tensor.rs 47.56% 43 Missing ⚠️
crates/burn-core/src/module/param/base.rs 94.11% 7 Missing ⚠️
crates/burn-core/src/record/primitive.rs 44.44% 5 Missing ⚠️
crates/burn-common/src/stub.rs 95.08% 3 Missing ⚠️
crates/burn-core/src/module/param/running.rs 25.00% 3 Missing ⚠️
crates/burn-core/src/nn/prelu.rs 0.00% 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1539      +/-   ##
==========================================
+ Coverage   86.30%   86.48%   +0.17%     
==========================================
  Files         684      684              
  Lines       78216    78093     -123     
==========================================
+ Hits        67502    67535      +33     
- Misses      10714    10558     -156     

☔ View full report in Codecov by Sentry.

@nathanielsimard nathanielsimard marked this pull request as ready for review March 28, 2024 16:38
Member

@louisfd louisfd left a comment


LGTM, just one sentence to fix

/// To avoid creating locks on already initialized parameter, we wrap the lock inside an
/// Option, the inner option is required for resetting the state onced initialized.
/// TLDR: RwLock(None) only happens on the param reference that is lazy, but was initialized,
/// all other parameters
Member


I can't parse this sentence

Member


I feel like the last sentence is incomplete 😅

all other parameters [blank].. what?
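The pattern the doc comment above is trying to describe can be sketched roughly as follows. This is a hypothetical, simplified illustration (names like `LazyParam` are invented here, and Burn's actual `Param` type in `burn-core` is more involved): the lock guards an `Option` so that, once the initializer has run, the slot is reset to `None` and already-initialized parameters never contend on the lock again.

```rust
use std::sync::{Mutex, OnceLock};

/// Hypothetical sketch of a lazily initialized parameter.
/// The `Mutex<Option<..>>` holds the initializer only until first use;
/// afterwards the cached value in `OnceLock` is returned directly.
struct LazyParam<T> {
    init: Mutex<Option<Box<dyn FnOnce() -> T + Send>>>,
    value: OnceLock<T>,
}

impl<T> LazyParam<T> {
    fn new(init: impl FnOnce() -> T + Send + 'static) -> Self {
        Self {
            init: Mutex::new(Some(Box::new(init))),
            value: OnceLock::new(),
        }
    }

    /// The first call consumes the initializer; later calls return the
    /// cached value without locking the `Mutex` at all.
    fn get(&self) -> &T {
        self.value.get_or_init(|| {
            let f = self
                .init
                .lock()
                .unwrap()
                .take()
                .expect("initializer runs exactly once");
            f()
        })
    }
}
```

In this sketch, `Mutex(None)` corresponds to the `RwLock(None)` state mentioned in the comment: it only occurs for a parameter that was lazy but has since been initialized.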

Member

@laggui laggui left a comment


Implementation LGTM!

The examples have been updated but I think we're missing changes to the book, no? We should reflect that.


Collaborator

@antimora antimora left a comment


LGTM

I had one question you should see.

Comment on lines 27 to 38
pub fn init_with<B: Backend>(&self, record: BenchmarkModuleRecord<B>) -> BenchmarkModule<B> {
    BenchmarkModule {
        linears: record
            .linears
            .into_iter()
            .map(|record| nn::Linear {
                weight: record.weight,
                bias: record.bias,
            })
            .collect(),
    }
}
Collaborator


Getting rid of `init_with` and adding it in other places? ;-)

Comment on lines 42 to 44
let running_mean = Tensor::zeros([self.num_features], device);
let running_var = Tensor::ones([self.num_features], device);

Collaborator


Any reason why it's different for running*?

Collaborator

@antimora antimora left a comment


Do we need to update the book?

@antimora antimora added the enhancement Enhance existing features label Mar 28, 2024
Member

@laggui laggui left a comment


LGTM! Minor phrasing comments but approving in advance because it's nothing critical.

Comment on lines 5 to 6
You need two things in order to load weights for a model: the model's weights and the model's
config. Since parameters in Burn are lazy initialized, no allocation and GPU/CPU kernels are executed by
Member


Rephrase to avoid using "weights" too many times?

Two things are required to load a model's weights: the model's record and the model's config.

@@ -40,29 +40,18 @@ fn main() {

## Good practices

By using the Config pattern it is easy to create instances from this
config. Therefore, initialization methods should be implemented on the config struct.
By using the config type it is easy to create new module instances from it. The initialization
Member


create new module instances from "it" sounds a bit weird. I think we can simply leave this out.

By using the config type it is easy to create new module instances.
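The "initialization methods live on the config" practice under discussion can be sketched like this. This is a minimal, self-contained illustration with invented names (`LinearConfig`, a `Vec<f32>` standing in for a tensor), not Burn's actual types:

```rust
/// Hypothetical config type: holds the hyperparameters of a module.
struct LinearConfig {
    d_input: usize,
    d_output: usize,
}

/// Hypothetical module type: holds the initialized state.
struct Linear {
    weight: Vec<f32>, // stand-in for a tensor
    bias: Vec<f32>,
}

impl LinearConfig {
    /// Initialization is implemented on the config, so callers write
    /// `LinearConfig { .. }.init()` rather than building the module by hand.
    fn init(&self) -> Linear {
        Linear {
            weight: vec![0.0; self.d_input * self.d_output],
            bias: vec![0.0; self.d_output],
        }
    }
}
```

Keeping `init` on the config means the config fully determines the module's shape, which is what makes it easy to create new module instances from it.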

@nathanielsimard nathanielsimard merged commit b0c5986 into main Apr 2, 2024
15 checks passed
@nathanielsimard nathanielsimard deleted the feat/lazy-init branch April 2, 2024 14:13
Labels
enhancement Enhance existing features
Development

Successfully merging this pull request may close these issues.

Improve module initialization.
4 participants