I imagine you’re right, that they’re NSString indexes packaged up into a 
frustrating return type. After sleeping on it, though, I realized that even if 
complex grapheme clusters WERE to make count( attrStr.string ) return a 
different result than attrStr.length, the difference would probably never be 
due to whitespace. So if I go back to Charles Srstka’s original suggestion, 
where you pull off one Character at a time, convert it to a one-Character 
string, and then test it for whitespace membership, I should be able to count 
the leading and trailing whitespace characters and then do math based on 
attrStr.length to build the range.
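
First, a quick check of that premise, borrowing Quincey’s \u{1f650} scalar for 
the exotic case (the example strings here are just mine, not anything from the 
real code):

import Cocoa

let plainAttr = NSAttributedString( string:"  Hello  " )
count( plainAttr.string )   // 9 Characters
plainAttr.length            // 9 UTF-16 units -- identical

let exoticAttr = NSAttributedString( string:"  Hell\u{1f650}  " )
count( exoticAttr.string )  // 9 Characters
exoticAttr.length           // 10 UTF-16 units -- the difference comes from the
                            // non-BMP scalar, not from the whitespace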

Here’s my current playground: 

import Cocoa

extension Character {

  func isMemberOfSet( set:NSCharacterSet )
    -> Bool
  {
    // The for loop typically executes just once; a Character
    // outside the Basic Multilingual Plane yields two UTF-16
    // code units. Its real purpose is to convert the Character
    // into values NSCharacterSet can actually test
    for char in String( self ).utf16 {
      if set.characterIsMember( char ) {
        return true
      }
    }
    return false
  }

}
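
A couple of throwaway checks of the extension, just to convince myself it 
behaves (these test Characters are mine and not part of the trimming code 
below):

// Illustrative checks only
let spaceChar: Character = " "
let letterChar: Character = "F"
spaceChar.isMemberOfSet( NSCharacterSet.whitespaceAndNewlineCharacterSet() )   // true
letterChar.isMemberOfSet( NSCharacterSet.whitespaceAndNewlineCharacterSet() )  // false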

var result:NSRange

let whitespace = NSCharacterSet.whitespaceAndNewlineCharacterSet()

let attrStr = NSAttributedString( string:"    Fourscore and seven years ago... \n\n \t\t" )
let str = attrStr.string

var headCount = 0
var tailCount = 0

var startIx = str.startIndex
var endIx = str.endIndex

while endIx > startIx && str[ endIx.predecessor() ].isMemberOfSet( whitespace ) {
  ++tailCount
  endIx = endIx.predecessor()
}
if endIx > startIx {
  while str[ startIx ].isMemberOfSet( whitespace ) {
    ++headCount
    startIx = startIx.successor()
  }
  let length = attrStr.length - ( headCount + tailCount )
  result = NSRange( location:headCount, length:length )
} else {
  // String was empty or all whitespace
  result = NSRange( location:0, length:0 )
}

let resultString = attrStr.attributedSubstringFromRange( result )
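
And to eyeball the result in the playground, a quick inspection line (the 
brackets just make the trimmed edges visible):

println( "[\(resultString.string)]" )   // [Fourscore and seven years ago...]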


— 

Charles

On April 2, 2015 at 11:16:52 PM, Quincey Morris 
(quinceymor...@rivergatesoftware.com) wrote:

On Apr 2, 2015, at 19:28 , Charles Jenkins <cejw...@gmail.com> wrote:

I can indeed call attrStr.string.rangeOfCharacterFromSet(). But in typical 
Swift string fashion, the return type is as unfriendly as possible: 
Range<String.Index>? — as if the NSString were a Swift string.

I finally read the whole of what you said here, and I had to run to a 
playground to check:

import Cocoa

var strA = "Hello?, String"
var strB = "Hello?, String" as NSString
var strC = "Hello\u{1f650}, String"
var strD = "Hello\u{1f650}, NSString" as NSString
var rangeA = strA.rangeOfCharacterFromSet(NSCharacterSet.whitespaceCharacterSet()) // {Some "7..<8"}
var rangeB = strB.rangeOfCharacterFromSet(NSCharacterSet.whitespaceCharacterSet()) // (7,1)
var rangeC = strC.rangeOfCharacterFromSet(NSCharacterSet.whitespaceCharacterSet()) // {Some "8..<9"}
var rangeD = strD.rangeOfCharacterFromSet(NSCharacterSet.whitespaceCharacterSet()) // (8,1)

So, yes, these are NSString indexes all the way, even if the result is packaged 
as a Range<String.Index>.
