Friday, May 8, 2015

iOS: using UIGraphicsGetImageFromCurrentImageContext to detect certain pixels on screen

I have a UIView where the user can draw various UIBezierPaths.

I need to analyze the drawn BezierPaths to detect certain patterns, but I have not found any way to convert a UIBezierPath into a list of coordinates/points; it seems like this is impossible? That is strange, as this data must be stored and used somehow to draw the actual paths.
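
For what it's worth, Core Graphics can walk a path's elements with CGPathApply, so the control points are recoverable; note these are the segments' control points, not every rendered pixel, and curves would still need to be flattened. A minimal sketch:

#import <UIKit/UIKit.h>

// Applier that collects the control points of each path element.
static void collectPathPoints(void *info, const CGPathElement *element) {
    NSMutableArray *points = (__bridge NSMutableArray *)info;
    int pointCount = 0;
    switch (element->type) {
        case kCGPathElementMoveToPoint:
        case kCGPathElementAddLineToPoint:      pointCount = 1; break;
        case kCGPathElementAddQuadCurveToPoint: pointCount = 2; break;
        case kCGPathElementAddCurveToPoint:     pointCount = 3; break;
        case kCGPathElementCloseSubpath:        pointCount = 0; break;
    }
    for (int i = 0; i < pointCount; i++) {
        [points addObject:[NSValue valueWithCGPoint:element->points[i]]];
    }
}

// Returns the move/line/curve control points of a UIBezierPath as NSValues.
static NSArray *pointsOfPath(UIBezierPath *path) {
    NSMutableArray *points = [NSMutableArray array];
    CGPathApply(path.CGPath, (__bridge void *)points, collectPathPoints);
    return points;
}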

So to bypass this problem, I decided to draw the BezierPath with a width of 1 px:

[path setLineWidth:1];

And convert my UIView to a UIImage:

UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
[self.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext(); // balance the Begin call so the context is not leaked

Then I can identify pixels by getting the color at a certain pixel position in the image:

- (UIColor *)colorAtPixel:(CGPoint)point {
    CGImageRef imageRef = [self CGImage];
    NSUInteger width = CGImageGetWidth(imageRef);
    NSUInteger height = CGImageGetHeight(imageRef);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    int bytesPerPixel = 4;
    long bytesPerRow = bytesPerPixel * width;
    int bitsPerComponent = 8;

    unsigned char *rawData = (unsigned char*) calloc(height * width * 4, sizeof(unsigned char));

    CGContextRef context = CGBitmapContextCreate(rawData,
                                                 width,
                                                 height,
                                                 bitsPerComponent,
                                                 bytesPerRow,
                                                 colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);

    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGContextRelease(context);

    // rawData now contains the image data in the RGBA8888 pixel format.
    long byteIndex = (bytesPerRow * (long)point.y) + (long)point.x * bytesPerPixel;
    CGFloat red   = rawData[byteIndex]     / 255.0;
    CGFloat green = rawData[byteIndex + 1] / 255.0;
    CGFloat blue  = rawData[byteIndex + 2] / 255.0;
    CGFloat alpha = rawData[byteIndex + 3] / 255.0;

    UIColor *color = [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
    free(rawData);

    return color;
}
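
A caveat worth adding: CGImageGetWidth/Height are in pixels, while view geometry is in points, so on a Retina screen (where the capture above used scale 0.0, i.e. the device scale) a tap location may need scaling before sampling. A hypothetical call site, assuming colorAtPixel: is a category method on UIImage:

CGPoint tap = CGPointMake(12.0, 34.0);   // location in view coordinates (points)
CGFloat scale = viewImage.scale;         // 2.0 on Retina devices
UIColor *c = [viewImage colorAtPixel:CGPointMake(tap.x * scale, tap.y * scale)];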

Now my issue is that the generated image is blurry: if I draw a straight 1 px BezierPath line and convert it to a UIImage, the line ends up about 3 px wide because of the blur.

How can I solve this? Is there really no way to convert BezierPaths to a list of coordinates?
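
Two things that may help with the blur, assuming it comes from Retina scaling plus antialiasing: passing a scale of 0.0 above renders at the device scale (2x on Retina), so a 1 px stroke already covers several bitmap pixels; an explicit scale of 1.0 together with antialiasing disabled should give hard edges. A sketch (whether the antialiasing flag survives renderInContext: can vary, so treat this as something to try, not a guarantee):

UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 1.0); // one bitmap pixel per point
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextSetShouldAntialias(ctx, NO); // hard edges instead of a soft multi-pixel falloff
[self.layer renderInContext:ctx];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

Aligning a horizontal or vertical 1 px stroke on a half-point boundary (e.g. y = 10.5) also keeps it from straddling two pixel rows.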

Validate an input as JSON notation on iOS

I have an input field on an iOS app that accepts an NSString value. I want to be able to validate the input as a JSON object. For example:

NSString *input1 = @"{'foo':'bar'}"; // would be validated as JSON notation
NSString *input2 = @"Hello world!";  // would NOT be validated as JSON

I have tried using the following method:

[NSJSONSerialization isValidJSONObject:(id)obj]

However, it always returns false, even if the string input is something like {'hello':'world'}. Is there anything I'm doing wrong or missing here?
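
Two things worth noting here. First, isValidJSONObject: checks whether an Objective-C object graph (e.g. an NSDictionary) can be serialized to JSON; it is not meant to test a string, so an NSString input will always fail. Second, {'hello':'world'} uses single quotes, which the JSON spec does not allow; keys and string values must use double quotes. A minimal sketch that validates a string by attempting to parse it:

NSString *input = @"{\"hello\":\"world\"}";   // note: double quotes, per the JSON spec
NSData *data = [input dataUsingEncoding:NSUTF8StringEncoding];
NSError *error = nil;
id object = [NSJSONSerialization JSONObjectWithData:data options:0 error:&error];
BOOL isValidJSON = (object != nil);           // nil plus a populated error means invalid JSON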

Comparing UI objects in iOS

I am making a superclass for my table view. It will have a static text view property, and I also have a text view in each cell.

When I start editing the text view in a cell, I assign it to the static property of the table view.

Now I need to go through the cells' subviews, compare the two text views (the static one and the one in the cell), and find the cell.

How can I compare the two text views?

I cannot compare their texts, because once some text has been typed the event is lost.
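
Since UIKit views are reference types, pointer identity (==) may be all that is needed here: it tests whether two variables refer to the same text view instance, regardless of what text has been typed. A minimal sketch, assuming a hypothetical activeTextView property on the table view subclass:

// Find the cell whose text view is the one currently being edited.
for (UITableViewCell *cell in self.visibleCells) {
    for (UIView *subview in cell.contentView.subviews) {
        if (subview == self.activeTextView) {   // same instance, not equal text
            NSLog(@"Found the cell: %@", cell);
        }
    }
}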

Django rotates iphone image after upload

I'm working on a photo website where I want the user to be able to upload a portrait or landscape oriented photo. The maximum width should be 1250px, but the maximum height could be 1667px if it's in portrait mode. When I upload photos in portrait orientation, they show up rotated 90 degrees to the left. Is there a way using Pillow to make sure the photo stays in the correct orientation?

This is my code:

class Result(models.Model):
    result01        = models.FileField(upload_to=get_upload_file_name, null=True, blank=True)
    result01thumb   = models.FileField(upload_to=get_upload_file_name, null=True, blank=True)

    def save(self, *args, **kwargs):
        super(Result, self).save(*args, **kwargs)
        if self.result01:
            size = 1667, 1250
            image = Image.open(self.result01)
            image.thumbnail(size, Image.ANTIALIAS)
            fh = storage.open(self.result01.name, "w")
            format = 'png'
            image.save(fh, format)
            fh.close()

It's important that users be able to upload photos from their phones while they're mobile, so the correct orientation is really important. Is there anything I can do here?
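
The likely cause (an assumption on my part, not stated above) is the EXIF Orientation tag: iPhones often store the pixels unrotated and record the rotation in EXIF, which Pillow ignores when resizing, and the saved PNG carries no EXIF at all. A minimal sketch using ImageOps.exif_transpose (available in Pillow 6.0+); note too that Image.thumbnail takes (max_width, max_height), so (1250, 1667) may be what was intended:

from PIL import Image, ImageOps

image = Image.open(self.result01)
# Physically rotate the pixels according to the EXIF Orientation tag,
# since the PNG saved afterwards carries no EXIF metadata.
image = ImageOps.exif_transpose(image)
image.thumbnail((1250, 1667), Image.ANTIALIAS)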

Apple Mach-O linker Error Xcode 6.2

I'm sorry if this question is a duplicate, but I am only asking because I didn't find a solution in other related questions.

I started a new project in Xcode 6.2

Imported AFNetworking using pods.

Imported SWRevealViewController by dragging the two files (.h and .m) into the project.

Everything looks fine, but when I build the project to test it, it gives me this insane error below.

I'm going crazy with this error.

Please see the image below:

[Error screenshot]

Does anyone know how to deal with this?

Thanks in advance.

Setting Device Independent RGB on iOS

I am detecting the RGB value of a tapped pixel. Different iPads return slightly different RGB values, so I maintain a plist of the different values returned per device; when the app opens, it determines which device it is running on and uses the appropriate values. This is a terrible solution, but it does work.

I now want to fix this properly, so I dove into color spaces on iOS. It seems I can use CGColorSpaceCreateCalibratedRGB to set a standard RGB color space regardless of device, so the values returned are the same? Or do I need to convert the returned values instead?

However, I do not know any of the values needed to create a standard color space across devices (so that my pixel color return values are always the same), or whether this is even possible.

Some current example return values:

iPad 2: r31 g0 b133 a1
iPad Air: r30 g0 b132 a1

Can anyone help 'normalize' the pixel return value in a device-independent way?

- (UIColor *)getPixelColorAtLocation:(CGPoint)point {
    UIColor *color = nil;
    CGImageRef inImage = self.image.CGImage;
    // Create an off-screen bitmap context to draw the image into. Format ARGB is 4 bytes per pixel: Alpha, Red, Green, Blue.
    CGContextRef cgctx = [self createARGBBitmapContextFromImage:inImage];
    if (cgctx == NULL) { return nil; /* error */ }

    size_t w = CGImageGetWidth(inImage);
    size_t h = CGImageGetHeight(inImage);
    CGRect rect = {{0, 0}, {w, h}};

    // Draw the image to the bitmap context. Once we draw, the memory
    // allocated for the context for rendering will then contain the
    // raw image data in the specified color space.
    CGContextDrawImage(cgctx, rect, inImage);

    // Now we can get a pointer to the image data associated with the bitmap
    // context.
    unsigned char *data = CGBitmapContextGetData(cgctx);
    if (data != NULL) {
        // offset locates the pixel in the data from x,y:
        // 4 bytes of data per pixel, w is the width of one row of data.
        int offset = 4 * ((w * round(point.y)) + round(point.x));
        int alpha = data[offset];
        int red   = data[offset + 1];
        int green = data[offset + 2];
        int blue  = data[offset + 3];
        //NSLog(@"colors: RGB A %i %i %i  %i", red, green, blue, alpha);
        color = [UIColor colorWithRed:(red / 255.0f) green:(green / 255.0f) blue:(blue / 255.0f) alpha:(alpha / 255.0f)];
    }

    // When finished, release the context.
    CGContextRelease(cgctx);
    // Free the image data memory used by the context.
    if (data) { free(data); }

    return color;
}

- (CGContextRef)createARGBBitmapContextFromImage:(CGImageRef)inImage {
    CGContextRef context = NULL;
    CGColorSpaceRef colorSpace;
    void *bitmapData;
    int bitmapByteCount;
    int bitmapBytesPerRow;

    // Get the image width and height. We'll use the entire image.
    size_t pixelsWide = CGImageGetWidth(inImage);
    size_t pixelsHigh = CGImageGetHeight(inImage);

    // Declare the number of bytes per row. Each pixel in the bitmap in this
    // example is represented by 4 bytes; 8 bits each of red, green, blue, and alpha.
    bitmapBytesPerRow = (pixelsWide * 4);
    bitmapByteCount = (bitmapBytesPerRow * pixelsHigh);

    // Use the generic RGB color space.
    colorSpace = CGColorSpaceCreateDeviceRGB();
    if (colorSpace == NULL) {
        fprintf(stderr, "Error allocating color space\n");
        return NULL;
    }

    // Allocate memory for the image data. This is the destination in memory
    // where any drawing to the bitmap context will be rendered.
    bitmapData = malloc(bitmapByteCount);
    if (bitmapData == NULL) {
        fprintf(stderr, "Memory not allocated!");
        CGColorSpaceRelease(colorSpace);
        return NULL;
    }

    // Create the bitmap context. We want pre-multiplied ARGB, 8 bits
    // per component. Regardless of what the source image format is
    // (CMYK, grayscale, and so on) it will be converted over to the format
    // specified here by CGBitmapContextCreate.
    context = CGBitmapContextCreate(bitmapData,
                                    pixelsWide,
                                    pixelsHigh,
                                    8, // bits per component
                                    bitmapBytesPerRow,
                                    colorSpace,
                                    kCGImageAlphaPremultipliedFirst);
    if (context == NULL) {
        free(bitmapData);
        fprintf(stderr, "Context not created!");
    }

    // Make sure to release the color space before returning.
    CGColorSpaceRelease(colorSpace);

    return context;
}
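
One possible fix (an editorial suggestion, not from the original): instead of CGColorSpaceCreateDeviceRGB, which is device-dependent by definition, draw into an explicitly calibrated space such as sRGB, so the same source pixel converts to the same RGBA on every device. kCGColorSpaceSRGB requires iOS 9 or later; a minimal variation on the context creation above:

CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
CGContextRef context = CGBitmapContextCreate(bitmapData,
                                             pixelsWide,
                                             pixelsHigh,
                                             8,              // bits per component
                                             bitmapBytesPerRow,
                                             colorSpace,     // calibrated sRGB instead of DeviceRGB
                                             kCGImageAlphaPremultipliedFirst);
CGColorSpaceRelease(colorSpace);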

How to Receive iOS Screen Lock Status

I am a noob just learning to code, so please bear with me.

I am trying to implement a location-tracking application, and here is my problem: I want to stop GPS updates when the iPhone is locked or goes into sleep mode.

I have used this discussion as a reference point. This is what I have so far.

LockNotifierCallback.h

#import <Foundation/Foundation.h>
@import CoreFoundation;

@interface LockNotifierCallback : NSObject

+ (void(*)(CFNotificationCenterRef center, void *observer, CFStringRef name, const void *object, CFDictionaryRef userInfo))notifierProc;

- (void)registerForDeviceLockNotifications;

@end

LockNotifierCallback.m

#import "LockNotifierCallback.h"
@import CoreFoundation;

static void lockcompleteChanged(CFNotificationCenterRef center, void *observer, CFStringRef name, const void *object, CFDictionaryRef userInfo) {
    NSString *notificationName = (__bridge NSString *)name;
    NSLog(@"Darwin notification NAME = %@", notificationName);

    NSString *timerStatus = @"No Change";
    if ([notificationName isEqualToString:@"com.apple.springboard.lockcomplete"]) {
        timerStatus = @"Yes";
    } else if ([notificationName isEqualToString:@"com.apple.springboard.lockstate"]) {
       timerStatus = @"No";
    }

    NSLog(@"success");
}

@implementation LockNotifierCallback;

+ (void(*)(CFNotificationCenterRef center, void *observer, CFStringRef name, const void *object, CFDictionaryRef userInfo))notifierProc {
    return lockcompleteChanged;
}

- (void)registerForDeviceLockNotifications
{
    NSLog(@"registering for device lock notifications");

    CFNotificationCenterAddObserver(CFNotificationCenterGetDarwinNotifyCenter(), //center
                                    NULL, // observer
                                    lockcompleteChanged, // callback
                                    CFSTR("com.apple.springboard.lockcomplete"), // event name
                                    NULL, // object
                                    CFNotificationSuspensionBehaviorDeliverImmediately);

    CFNotificationCenterAddObserver(CFNotificationCenterGetDarwinNotifyCenter(), //center
                                    NULL, // observer
                                    lockcompleteChanged, // callback
                                    CFSTR("com.apple.springboard.lockstate"), // event name
                                    NULL, // object
                                    CFNotificationSuspensionBehaviorDeliverImmediately);
}


@end

Xcode compiles this without any errors, but I don't think it is working, as I don't see any logs in the Console when I run the app in the simulator.
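
One thing worth checking (an assumption about the project setup, which is not shown above): registerForDeviceLockNotifications is an instance method, so the observers are only added if it is actually invoked somewhere, e.g. at launch. A hypothetical call site:

LockNotifierCallback *notifier = [[LockNotifierCallback alloc] init];
[notifier registerForDeviceLockNotifications];

From Swift, via the bridging header, the equivalent would be LockNotifierCallback().registerForDeviceLockNotifications().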

It's a Swift project, and I have #import "LockNotifierCallback.h" in my bridging header.

Please answer this in detail as I am still learning to code.

Appreciate your help.

EDIT

I am using Objective-C only to receive the Darwin notifications for lock state (device sleep/awake) in order to optimize battery life, as my app will run in the background; apart from this, I plan to implement location tracking and other functions in Swift.