[...] It used to work reasonably, but regressed in Xcode 7 (<rdar://21306753>), and in Xcode 8 it works well again. I'm able to have it on in my largish Obj-C codebase with no warnings. You might want to try it again. Cheers,
[...] "Strict Selector Matching"? I agree, it's infeasible to use that in any real-world code. However, there is a static analyzer flag "Method Signatures Mismatch" that IIRC does this same type of checking without the false alarms. --Jens
On Oct 18, 2016, at 10:56 , Patrick Collins <email@hidden> wrote: > quoted text I don't know the answer exactly, because it depends on a number of factors, possibly including random values left over from previous instructions. Depending on the circumstances at each call site, 8 bytes are being passed and 4 of them ignored, or 4 bytes are being passed and an additional 4 bytes of unrelated data are being accessed as part of the value. If it worked at all on 64-bit iPhones, then you should treat that as mere luck (or, really, *bad* luck). BTW, I believe the ARM hardware in iPhones can run both 32-bit and 64-bit executables (for compatibility with very old apps). It's not outside the bounds of credibility that you might be building only 32-bit versions for that hardware. That is, if this project is pretty old, your build settings may be incorrect. Or, the graphics framework you're linking to might be 32-bit. Diagnosing problems at that level is way beyond my pay grade. > quoted text Yikes, that's horrible. However, in most cases you'll get reasonable results because the compiler will provide value conversions. It fails when you lie to the compiler about the parameter and return types, or when the compiler can't determine the correct types. On Oct 18, 2016, at 10:56 , Wim Lewis <email@hidden> wrote: > quoted text There is a compiler warning that can be turned on for this (one of the "mismatched selector" warnings), but in my experience there are too many false negatives *and* false positives, making the warning useless.
On Oct 18, 2016, at 1:56 AM, Patrick J. Collins <email@hidden> wrote: > quoted text To expand on Quincey's explanation, what's going on here is that the "cast" syntax in the debugger is doing something different from what a cast in source code would do. [...]
On Oct 18, 2016, at 01:56 , Patrick J. Collins <email@hidden> wrote: > quoted text Exactly so. The bug is that you declared 'setScale' in your protocol with the wrong return type. It's a weakness of Obj-C that its dynamism prevents the compiler from statically checking return types in all cases, and float vs. double is one of the particularly dangerous cases. If your game library uses float consistently, you should generally use float throughout your code, too. If you had been using Sprite Kit, you would generally use CGFloat throughout your code, because that's what it uses. Note that using the 'wrong' type in expressions and assignments is not a problem, because the compiler emits code for the conversion for you. So, something like this: int /* or: float */ one = 1; item.scale = one; works however 'scale' is declared, because 'one' is converted to the correct floating point type. The issue only occurs when a caller is expecting a different type from what the callee (unknown to the caller) returns.
[...] Err sorry, I mean casting as a float does not. Patrick J. Collins http://collinatorstudios.com
[...] Hmm.. I guess I don't understand that then. My logic was that because casting as a double returns the expected value, and casting as a CGFloat does not, it couldn't be a double... [...] The scale code I discovered was breaking due to a protocol's interface [...]
[...] You're having a little temper fit here, which is fine by me except that it's leading you into technical incorrectness, not to mention contradicting yourself. You said: [...] That is, according to you, casting *as a float* produces 0, and casting as CGFloat or double [ [...]
[...] I'm not sure what you mean? The app is intended to work on both 32-bit and 64-bit architectures. My iPad 3 is a 32-bit device, and my iPhone 6 is a 64-bit device. On both of those devices everything is fine, and casting 1.0 as either a CGFloat or a float results in 1.0 (as expected). [...]
[...] Is your process 64-bit or 32-bit? CGFloat is float in 32-bit processes and double in 64-bit processes. I don't think iPad Pro differs from any other 64-bit iOS device. Do you get the same result if you run that code in your app instead of on the debugger console? [...]
I should add-- this makes me so scared to release my application because there could be countless places where numbers are accidentally getting turned to 0 when they shouldn't be as CGFloat is used all over the place. And also I should say this behavior does NOT happen on any other of my [...]
I just got an iPad Pro over the weekend to make sure my app worked on it, and sure enough, it didn't... After spending a ton of time debugging, I finally figured out what the deal is, and honestly, I do not understand it at all. Here is the problem: (lldb) expr (float)[self. [...]
Patrick, On 24. 9. 2016, at 17:11, Patrick J. Collins <email@hidden> wrote: [...] Please note that this kind of design is conceptually wrong and extremely error-prone, as Manoah pointed out: *never* do this, unless it is *very* necessary and there is no cleaner way out. [...]
[...] You're another victim of auto property synthesis. By default, if you declare a property in your @interface but don't put in any sort of implementation, the compiler will automatically decide you meant to @synthesize it, and will create an ivar and a getter and setter. [...]
And--- as usual, moments after asking a question, something occurs to me that makes me identify the cause of the odd behavior, except I still don't quite understand it.. but at least I was able to fix it. The problem was, I was also subclassing. I had accidentally redeclared the same property in the subclass... though I wasn't overriding the getter/setter: @interface FancyList : List @property (nonatomic, weak) Cell *currentCell; @end @implementation FancyList @end And this for some reason caused the base class's ivar to never get set? Somehow it got confused and wasn't able to access the real ivar? or something? Anyway.. deleting that property in the subclass fixes the issue. Patrick J. Collins http://collinatorstudios.com On Sat, 8 Oct 2016, Patrick J. Collins wrote: [...]
Every once in a while, I experience behavior that really throws me off and makes me realize I do not fully understand ARC. I just came across one of these situations. I have some code that does something like: @interface List () @property (nonatomic, strong) NSArray *cells; [...]