
load PFIs from known source and cache as shared preference #17

Merged: 5 commits into main on Jan 26, 2024

Conversation

michaelneale (Contributor):

This will load the PFIs from an open GitHub repo, TBD54566975/pfi-providers-data, and cache them.

It even shows a spinny circle the first time it is loading, which is kind of neat. Great typo on the branch name.

michaelneale (Contributor, Author):

I need to add to this the ability to quickly set a local DID for when using a local PFI

@michaelneale michaelneale marked this pull request as draft January 25, 2024 21:37
wesbillman (Member) left a comment:

Amazing @michaelneale!!

This is a great idea! I was also thinking about maybe having a local version of pfis.json or something so we can try out new PFIs before officially releasing them to the pfi-providers-data repo. But this is a great start.

Comment on lines +45 to +51
return (json.decode(cachedData) as List).map((data) {
  return Pfi(
    id: data['id'] as String,
    name: data['name'] as String,
    didUri: data['didUri'] as String,
  );
}).toList();
wesbillman (Member):

For this type of stuff, you can add toJson and fromJson to the type to centralize this logic. Then you can do Pfi.fromJson(data) in these 2 places. That might look like this in pfi.dart:

  Map<String, dynamic> toJson() {
    return {
      'id': id,
      'name': name,
      'didUri': didUri,
    };
  }

  factory Pfi.fromJson(Map<String, dynamic> json) {
    return Pfi(
      id: json['id'],
      name: json['name'],
      didUri: json['didUri'],
    );
  }

Comment on lines 16 to 22
List<Pfi> pfis = (json.decode(response.body) as List).map((data) {
  return Pfi(
    id: data['id'] as String,
    name: data['name'] as String,
    didUri: data['didUri'] as String,
  );
}).toList();
wesbillman (Member):

With the serialization change below, you can update to this if you wanna:

final pfis = (json.decode(response.body) as List)
    .map<Pfi>((p) => Pfi.fromJson(p))
    .toList();

Comment on lines 24 to 26
// Save the data to cache
final prefs = await SharedPreferences.getInstance();
await prefs.setString(cacheKey, response.body);
wesbillman (Member):

We can add a provider for this as well so that any code can use SharedPreferences without having to get an instance. It will be similar to how we set up secureStorage:
https://github.com/TBD54566975/didpay/blob/main/frontend/lib/services/service_providers.dart#L4

Then initialize and override it in main.dart:
https://github.com/TBD54566975/didpay/blob/main/frontend/lib/main.dart#L13-L24

Then when you need prefs you can use ref.read(sharedPreferencesProvider).setString(...)
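Not part of the PR, but a rough sketch of the pattern described above, assuming the riverpod and shared_preferences packages; the provider name matches the comment, everything else is illustrative:

```dart
import 'package:flutter/material.dart';
import 'package:flutter_riverpod/flutter_riverpod.dart';
import 'package:shared_preferences/shared_preferences.dart';

// service_providers.dart: throws until overridden in main(),
// so any un-overridden read fails loudly during development.
final sharedPreferencesProvider = Provider<SharedPreferences>(
  (ref) => throw UnimplementedError('Override in main()'),
);

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  // Resolve the singleton once at startup...
  final prefs = await SharedPreferences.getInstance();
  runApp(
    ProviderScope(
      // ...then hand it to the whole widget tree via the provider.
      overrides: [sharedPreferencesProvider.overrideWithValue(prefs)],
      child: const MaterialApp(), // placeholder root widget
    ),
  );
}
```

After that, `ref.read(sharedPreferencesProvider).setString(...)` works anywhere a ref is available, and tests can override the provider with a fake.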

);
final pfisProvider = FutureProvider<List<Pfi>>((ref) async {
const url = 'https://raw.githubusercontent.com/TBD54566975/pfi-providers-data/main/pfis.json';
const cacheKey = 'pfis_cache';
wesbillman (Member):

super nit: We can make cacheKey private to the file to avoid having to pass it to the _loadFromCache function below.

Comment on lines 29 to 36
  } else {
    // If the server returns an unsuccessful response, try loading from cache
    return await _loadFromCache(cacheKey);
  }
} catch (e) {
  // In case of an error, try loading from cache
  return await _loadFromCache(cacheKey);
}
wesbillman (Member):

Another approach I like to use on mobile, to keep things snappy, is to always load from cache first, then always refresh from the network. That way the user gets the list right away from cache, and we can refresh from the network after. This would require another provider/notifier, which is a bit more complicated. Happy to show how that might work as well if you're interested.
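A rough sketch of that cache-first-then-refresh notifier, assuming riverpod, http, and shared_preferences, and the Pfi.fromJson factory suggested above (included here so the snippet stands alone); names are illustrative, not the final implementation:

```dart
import 'dart:convert';

import 'package:flutter_riverpod/flutter_riverpod.dart';
import 'package:http/http.dart' as http;
import 'package:shared_preferences/shared_preferences.dart';

const _url =
    'https://raw.githubusercontent.com/TBD54566975/pfi-providers-data/main/pfis.json';
const _cacheKey = 'pfis_cache';

class Pfi {
  Pfi({required this.id, required this.name, required this.didUri});

  factory Pfi.fromJson(Map<String, dynamic> json) => Pfi(
        id: json['id'],
        name: json['name'],
        didUri: json['didUri'],
      );

  final String id;
  final String name;
  final String didUri;
}

class PfisNotifier extends StateNotifier<AsyncValue<List<Pfi>>> {
  PfisNotifier(this._prefs) : super(const AsyncValue.loading()) {
    _load();
  }

  final SharedPreferences _prefs;

  Future<void> _load() async {
    // 1. Serve whatever is cached immediately, so the UI never blocks.
    final cached = _prefs.getString(_cacheKey);
    if (cached != null) {
      state = AsyncValue.data(_parse(cached));
    }

    // 2. Refresh from the network in the background.
    try {
      final response = await http.get(Uri.parse(_url));
      if (response.statusCode == 200) {
        await _prefs.setString(_cacheKey, response.body);
        state = AsyncValue.data(_parse(response.body));
      }
    } catch (_) {
      // Network failed: keep showing the cached list (or the loading state).
    }
  }

  List<Pfi> _parse(String body) =>
      (json.decode(body) as List).map((p) => Pfi.fromJson(p)).toList();
}
```

A StateNotifierProvider could then expose it, e.g. `StateNotifierProvider<PfisNotifier, AsyncValue<List<Pfi>>>((ref) => PfisNotifier(ref.read(sharedPreferencesProvider)))`, keeping the widget-side `.when(...)` calls unchanged.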

michaelneale (Contributor, Author):

good idea - did that.

),
)
],
body: pfisAsyncValue.when(
wesbillman (Member):

nit: you can use the ref directly here. This has the advantage that only this part of the widget tree will rebuild when the value changes.

ref.watch(pfisProvider).when(....

michaelneale (Contributor, Author):

yep - did it

wesbillman (Member):

> I need to add to this the ability to quickly set a local DID for when using a local PFI

We can add some configs for prod, staging, and local etc.

ethan-tbd (Contributor) left a comment:

rly cool stuff!

@michaelneale michaelneale marked this pull request as ready for review January 26, 2024 09:27
@wesbillman wesbillman merged commit bfc3911 into main Jan 26, 2024
1 check passed
@wesbillman wesbillman deleted the dyanmic_pfis branch January 26, 2024 18:48