This app http://ift.tt/OCnqDb lets users log in, then take and post photos to a stream screen. When the user taps a photo, a segue is triggered and a full-size version of the photo is shown.
The app uses a MySQL backend. Every user has an IdUser, a username, and a password. Photos are stored in a separate table, keyed by IdPhoto, and each row is linked to the IdUser that uploaded it via a PHP/Objective-C/JSON method combination.
When a photo is tapped, the segue below is performed and the user is transitioned to a new view controller that shows the photo fullscreen.
-(void)didSelectPhoto:(PhotoView*)sender {
    // photo selected - show it full screen
    [self performSegueWithIdentifier:@"ShowPhoto"
                              sender:[NSNumber numberWithInt:sender.tag]];
}

-(void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender {
    if ([segue.identifier isEqualToString:@"ShowPhoto"]) {
        StreamPhotoScreen* streamPhotoScreen = segue.destinationViewController;
        streamPhotoScreen.IdPhoto = sender;
    }
}
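For context, the destination controller has to expose IdPhoto as a property for the assignment above to compile; its header is not shown in the post, so this is a minimal sketch of what StreamPhotoScreen.h would presumably contain, with lblTitle and photoView as assumed outlets:

```objectivec
// StreamPhotoScreen.h -- hypothetical header matching the implementation below
#import <UIKit/UIKit.h>

@interface StreamPhotoScreen : UIViewController {
    IBOutlet UILabel*     lblTitle;  // assumed outlet for the caption
    IBOutlet UIImageView* photoView; // assumed outlet for the full-size photo
}

// the segue hands over an NSNumber (the tapped view's tag),
// so the property is typed accordingly
@property (strong, nonatomic) NSNumber* IdPhoto;

@end
```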
The segue lands in this view controller, which runs the following code:
#import "StreamPhotoScreen.h"
#import "API.h"

@implementation StreamPhotoScreen
@synthesize IdPhoto;

-(void)viewDidLoad {
    [super viewDidLoad];

    API* api = [API sharedInstance];

    // load the caption of the selected photo
    [api commandWithParams:[NSMutableDictionary dictionaryWithObjectsAndKeys:
                               @"stream", @"command",
                               IdPhoto, @"IdPhoto",
                               nil]
              onCompletion:^(NSDictionary *json) {
                  // show the text in the label
                  NSArray* list = [json objectForKey:@"result"];
                  NSDictionary* photo = [list objectAtIndex:0];
                  lblTitle.text = [photo objectForKey:@"title"];
              }];

    // load the full-size photo
    NSURL* imageURL = [api urlForImageWithId:IdPhoto isThumb:NO];
    [photoView setImageWithURL:imageURL];
}

-(BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation {
    // Return YES for supported orientations
    return (interfaceOrientation == UIInterfaceOrientationPortrait);
}

-(void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
}

@end
Sidenote: I replaced the UIImagePickerController instance in the native app with a more customizable AVCam-based capture session.
Question: how can we extend the AVCam method below so that, when a user taps a photo (IdPhoto) on the stream screen, the camera hardware is triggered instead of the segue above being performed? The captured JPEG representation should then be shown to the tapping user in the view controller, associated with the IdUser whose photo was tapped.
- (void)snapStillImage // currently called from viewDidLoad
{
    dispatch_async([self sessionQueue], ^{
        // Update the orientation on the still image output video connection before capturing.
        [[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo]
            setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

        // Flash set to Auto for still capture
        [ViewController5 setFlashMode:AVCaptureFlashModeAuto
                            forDevice:[[self videoDeviceInput] device]];

        // Capture a still image.
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:[[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo]
                                                             completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer) {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];

                // UI updates must happen on the main queue
                dispatch_async(dispatch_get_main_queue(), ^{
                    photo.image = image;
                });

                [[[ALAssetsLibrary alloc] init] writeImageToSavedPhotosAlbum:[image CGImage]
                                                                 orientation:(ALAssetOrientation)[image imageOrientation]
                                                             completionBlock:nil];
                [self uploadPhoto];
            }
        }];
    });
}
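One possible approach, sketched under assumptions (CameraScreen and its tappedIdPhoto property are hypothetical names, not confirmed by the post): rather than performing the ShowPhoto segue in didSelectPhoto:, present the view controller that owns the AVCam session and trigger snapStillImage once presentation completes. Note that [UIDevice currentDevice] is not needed to reach the camera hardware; AVFoundation's AVCaptureDevice/AVCaptureSession (already wired up in the AVCam code above) is the API that drives it.

```objectivec
// Hypothetical replacement for the segue in didSelectPhoto: -- instead of
// showing the tapped photo fullscreen, present the AVCam-based camera
// screen and trigger a capture. CameraScreen is an assumed UIViewController
// subclass that owns the snapStillImage method above.
-(void)didSelectPhoto:(PhotoView*)sender {
    CameraScreen* camera = [[CameraScreen alloc] init];

    // remember which photo (and therefore which IdUser) was tapped, so the
    // capture can be associated with it when uploadPhoto runs
    // (tappedIdPhoto is an assumed property on CameraScreen)
    camera.tappedIdPhoto = [NSNumber numberWithInt:sender.tag];

    [self presentViewController:camera animated:YES completion:^{
        // capture only once the AVCaptureSession is running; AVCam starts
        // its session in viewWillAppear:, and snapStillImage already
        // dispatches onto the session queue internally
        [camera snapStillImage];
    }];
}
```

With this wiring, the captured image lands in the presented controller's image view (via the completion handler above) instead of the StreamPhotoScreen segue destination, and uploadPhoto can pass tappedIdPhoto back to the PHP backend to link the new capture to the original uploader's IdUser.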
I want to achieve the above using any available methodology; any references or working code I could build on would be appreciated.