Sandbox Architecture
- Package input params on host side
- Unpack input params on XPC side
- Execute operation
- Package results on XPC side
- Unpack results on host app side
- Execute completion block (which uses the results)
com.apple.security.assets.movies.read-only
com.apple.security.assets.music.read-only
com.apple.security.assets.pictures.read-only
- The above three are all very logical, but what about bookmarks? Where exactly are we reading them from presently?
- As I understand it, if someone drags a folder into iMedia to add it to the source list, it's automatically added to our sandbox for as long as the app is running. There's no public API for maintaining that access across launches yet.
- Update (added since this was written): as of 10.7.3 there is now a public API for this: Security-Scoped Bookmarks and Persistent Resource Access (sketched below).
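A minimal sketch of how that API could be used to keep access to a dragged-in folder across launches. The defaults key and the folderURL variable are placeholders, and app-scoped bookmarks additionally require the com.apple.security.files.bookmarks.app-scope entitlement:

// Sketch: persist access to a user-selected folder across launches (10.7.3+).
// "IMBCustomFolderBookmark" and folderURL are placeholders.

// When the user drags a folder into the source list:
NSError* error = nil;
NSData* bookmark = [folderURL bookmarkDataWithOptions:NSURLBookmarkCreationWithSecurityScope
                       includingResourceValuesForKeys:nil
                                        relativeToURL:nil
                                                error:&error];
[[NSUserDefaults standardUserDefaults] setObject:bookmark forKey:@"IMBCustomFolderBookmark"];

// On a later launch:
BOOL isStale = NO;
NSURL* folderURL = [NSURL URLByResolvingBookmarkData:bookmark
                                             options:NSURLBookmarkResolutionWithSecurityScope
                                       relativeToURL:nil
                                 bookmarkDataIsStale:&isStale
                                               error:&error];
[folderURL startAccessingSecurityScopedResource];
// ... let the parser read the folder ...
[folderURL stopAccessingSecurityScopedResource];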
com.apple.security.network.client (for Flickr)
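Taken together, a minimal .entitlements file for the sandboxed target (host app or XPC service, depending on where the parsers end up living) might look roughly like this; com.apple.security.app-sandbox is the basic sandbox opt-in key, the rest are the entitlements listed above:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>com.apple.security.app-sandbox</key>
    <true/>
    <key>com.apple.security.assets.movies.read-only</key>
    <true/>
    <key>com.apple.security.assets.music.read-only</key>
    <true/>
    <key>com.apple.security.assets.pictures.read-only</key>
    <true/>
    <key>com.apple.security.network.client</key>
    <true/>
</dict>
</plist>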
On the XPC bundle side (IMBParser):
-(NSError*) populateNode:(IMBNode*)inNode;
On the host app side (IMBLibraryController):
-(void) populateNode:(IMBNode*)inOldNode completionBlock:(void(^)(IMBNode* inNewNode,NSError* inError))inCompletionBlock;
Note that packing and unpacking could be short-circuited here by simply copying the node; the version below is written this way to demonstrate the symmetry with the XPC-based implementations that follow.
// GCD version (host app side): run the parser in-process on a background queue.
// Nil handling is omitted for brevity.

// Pack
NSData* oldNodeData = [NSKeyedArchiver archivedDataWithRootObject:inOldNode];

dispatch_async(_backgroundQueue,^()
{
    // Unpack
    IMBNode* node = [NSKeyedUnarchiver unarchiveObjectWithData:oldNodeData];

    // Execute
    IMBParser* parser = [node parser];
    NSError* error = [parser populateNode:node];

    // Pack
    NSData* newNodeData = [NSKeyedArchiver archivedDataWithRootObject:node];
    NSData* errorData = [NSKeyedArchiver archivedDataWithRootObject:error];

    // Unpack
    node = [NSKeyedUnarchiver unarchiveObjectWithData:newNodeData];
    error = [NSKeyedUnarchiver unarchiveObjectWithData:errorData];

    dispatch_async(dispatch_get_main_queue(),^()
    {
        // Deliver result
        inCompletionBlock(node,error);
    });
});
// XPC version (host app side): send the archived node to the service and
// deliver the unarchived result via the completion block.

XPCConnection* connection = [[XPCConnection alloc] initWithServiceName:@"com.karelia.imedia.xpc"];

connection.eventHandler = ^(NSDictionary* inMessage, XPCConnection* inConnection)
{
    // Unpack
    NSData* newNodeData = [inMessage objectForKey:@"node"];
    NSData* errorData = [inMessage objectForKey:@"error"];
    IMBNode* newNode = [NSKeyedUnarchiver unarchiveObjectWithData:newNodeData];
    NSError* error = [NSKeyedUnarchiver unarchiveObjectWithData:errorData];

    dispatch_async(dispatch_get_main_queue(),^()
    {
        // Deliver result
        inCompletionBlock(newNode,error);
    });
};

// Pack
NSData* oldNodeData = [NSKeyedArchiver archivedDataWithRootObject:inOldNode];

NSDictionary* message = [NSDictionary dictionaryWithObjectsAndKeys:
    @"populateNode",@"operation",
    oldNodeData,@"node",
    nil];

[connection sendMessage:message];
// XPC version (service side): main() of the XPC bundle.

#import <Foundation/Foundation.h>
#import <XPCKit/XPCKit.h>   // XPCService / XPCConnection (assumed umbrella header name)
#import "IMBNode.h"         // iMedia model classes
#import "IMBParser.h"

int main(int argc, const char *argv[])
{
    [XPCService runServiceWithConnectionHandler:^(XPCConnection* inConnection)
    {
        [inConnection setEventHandler:^(NSDictionary* inMessage, XPCConnection* inEventConnection)
        {
            NSString* operation = [inMessage objectForKey:@"operation"];

            if ([operation isEqualToString:@"populateNode"])
            {
                // Unpack
                NSData* nodeData = [inMessage objectForKey:@"node"];
                IMBNode* node = [NSKeyedUnarchiver unarchiveObjectWithData:nodeData];

                // Execute
                IMBParser* parser = [node parser];
                NSError* error = [parser populateNode:node];

                // Pack
                NSData* newNodeData = [NSKeyedArchiver archivedDataWithRootObject:node];
                NSData* errorData = [NSKeyedArchiver archivedDataWithRootObject:error];

                NSDictionary* result = [NSDictionary dictionaryWithObjectsAndKeys:
                    newNodeData,@"node",
                    errorData,@"error",
                    nil];

                [inEventConnection sendMessage:result];
            }
        }];
    }];

    dispatch_main();
    return 0;
}
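For completeness, calling the host-app-side method from the UI layer would then look something like this; libraryController, node, and the UI update are placeholders:

// Sketch: how a caller in the host app would use the asynchronous API.

[libraryController populateNode:node completionBlock:^(IMBNode* inNewNode,NSError* inError)
{
    if (inError)
    {
        NSLog(@"Populating node failed: %@",inError);
    }
    else
    {
        // Replace the old node with inNewNode in the tree and refresh the bound views...
    }
}];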
- Small memory footprint
- Minimum possible CPU load
=> Lazy loading:
- Only load one node at a time; load more as disclosure triangles are expanded
- Only load thumbnails/metadata of objects that are currently visible (see the sketch below)
- Load more as objects are scrolled into view
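A rough sketch of the "only what is visible" rule, assuming the object view is an IKImageBrowserView; self.objects, the imageRepresentation check, and loadThumbnailForObject: are placeholders for whatever the final asynchronous API looks like:

// Sketch: only request thumbnails for objects that are currently visible.

- (void) loadVisibleThumbnailsInBrowser:(IKImageBrowserView*)inBrowser
{
    NSIndexSet* visibleIndexes = [inBrowser visibleItemIndexes];

    [visibleIndexes enumerateIndexesUsingBlock:^(NSUInteger inIndex, BOOL* outStop)
    {
        IMBObject* object = [self.objects objectAtIndex:inIndex];

        if (object.imageRepresentation == nil)      // thumbnail not loaded yet
        {
            [self loadThumbnailForObject:object];   // hypothetical asynchronous request
        }
    }];
}

Calling this from a bounds-change notification of the enclosing scroll view would also cover the "load more as objects are scrolled into view" case.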
- Manage and instantiate parser classes (IMBParserController)
  - Will be moved to the XPC bundle
  - Accessing parsers from the host app will require crossing the XPC boundary with a new asynchronous API
- Singleton controller for each media type (IMBLibraryController)
  - "Owns" the data model (tree of IMBNodes)
  - Will stay in the host app, because the UI is bound to this controller
- Possibly multiple user interfaces (hooked up via bindings to a single IMBLibraryController)
  - IMBNodeViewController is bound to the IMBLibraryController (Master view)
  - IMBObjectViewController is bound to the selection of IMBNodeViewController (Detail view)
Maybe we should also rethink the strategy for standard vs custom IMBObjectViewControllers. Currently that handling is somewhat complicated.
- Manage Parsers
  - Decide which parsers are allowed to operate. Currently done via delegate methods. Might be more complicated with XPC (see the sketch below).
  - Discovery of media. Parsers need broad entitlements for that, so they need to live in an XPC bundle, not the host app.
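A very rough sketch of how the delegate decision could be forwarded to the XPC side once the parsers live there; shouldUseParserWithIdentifier:, the surrounding variables, and the message keys are purely illustrative, not existing iMedia API:

// Sketch only: let the host app's delegate filter parsers by identifier and
// tell the XPC service which ones are allowed to operate.

NSMutableArray* allowedIdentifiers = [NSMutableArray array];

for (NSString* identifier in parserIdentifiersFromService)   // identifiers reported by the XPC side
{
    if ([delegate shouldUseParserWithIdentifier:identifier])
    {
        [allowedIdentifiers addObject:identifier];
    }
}

[connection sendMessage:[NSDictionary dictionaryWithObjectsAndKeys:
    @"setAllowedParsers",@"operation",
    allowedIdentifiers,@"parserIdentifiers",
    nil]];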
- Create & Populate Data Model
  - Create empty (unpopulated) nodes. Often root nodes.
  - Expand a node to populate its subnodes.
  - Select a node to populate its objects.
  - Reload a node (tree) when something important has changed. This can be triggered by the user (context menu command) or a file system event.
Currently we have two methods in IMBParser to take care of the four tasks mentioned above. Maybe we should split that up into four methods to better handle each specific task (a possible shape is sketched below); that would probably also help to make the architecture more self-explanatory. The new architecture needs to be completely asynchronous, and the IMBLibraryController needs to be robust enough to integrate results from background operations at the correct location in the tree. Current problems with leaks and zombie crashes need to be tracked down and solved. Reloading triggered by file system events should do as little work as possible, e.g. when a subfolder of ~/Pictures needs to be reloaded, don't reload anything else, and try to keep the model objects there intact.
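One possible shape for that split, matching the four tasks above. The method names are a sketch for discussion, not a decided API; the host app side would wrap each of these in an asynchronous completion-block variant, just like populateNode: above:

// Sketch: one XPC-side IMBParser method per task (names not final).

- (IMBNode*) unpopulatedTopLevelNodeWithError:(NSError**)outError;            // create empty (root) nodes
- (BOOL) populateSubnodesOfNode:(IMBNode*)inNode error:(NSError**)outError;   // expand a node
- (BOOL) populateObjectsOfNode:(IMBNode*)inNode error:(NSError**)outError;    // select a node
- (BOOL) reloadNode:(IMBNode*)inNode error:(NSError**)outError;               // react to user command or file system event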
- Media File Access
  - Loading of thumbnails
  - Loading of metadata
  - Loading of complete media data (generic NSData can be used for all media types, but CGImageRef and other image types should be available for visual data)
This is a new part of the architecture that is mandatory for sandboxed apps, but can be ignored by non-sandboxed apps (i.e. they can access media files directly as they are not restricted in any way). What should be do about movie or audio files? Pushing 2GB NSData objects across XPC boundaries is probably not a good idea? Will inline movie previews in the IKImageBrowser still work? Is IOSurface an option here?
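A sketch of what the host-app-facing API for this part could look like; the names and types are illustrative only. For large movie or audio files, handing back a file URL or security-scoped bookmark instead of an NSData blob might be one way to avoid pushing gigabytes across the XPC boundary:

// Sketch: asynchronous media file access on the host app side (illustrative names).

- (void) loadThumbnailForObject:(IMBObject*)inObject
                completionBlock:(void(^)(CGImageRef inThumbnail,NSError* inError))inCompletionBlock;

- (void) loadMetadataForObject:(IMBObject*)inObject
               completionBlock:(void(^)(NSDictionary* inMetadata,NSError* inError))inCompletionBlock;

- (void) loadDataForObject:(IMBObject*)inObject
           completionBlock:(void(^)(NSData* inData,NSError* inError))inCompletionBlock;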