**Problem**(verbatim):

Suppose that each row of an n x n array A consists of 1's and 0's such that in any row of A, all the 1's come before any 0's in that row. Assuming A is already in memory, describe a method running in O(n) time (not O(n²) time) for finding the row of A that contains the most 1's.

Yes, you read that correctly. The problem in the book asks for O(n) time! The best I could come up with is O(n log n). My solution has an outer for loop that runs from 0 to n - 1, and for each row the inner step does a binary search to find the boundary between the 1's and the 0's. So the outer loop runs n times and each binary search takes at most about log n steps, which gives n · log n, i.e., O(n log n). Am I missing something? For the algorithm to run in O(n), wouldn't every operation inside the outer loop have to be constant time? My instincts tell me it's a misprint, but I'd like some confirmation on that. Anyone?
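For reference, here's a quick sketch of the O(n log n) approach I described, in Python (function names are just mine, not from the book). The binary search finds the index of the first 0 in a row, which equals that row's count of 1's since all the 1's come first:

```python
def count_ones(row):
    # Binary search for the index of the first 0.
    # Because every 1 precedes every 0, that index is the number of 1's.
    lo, hi = 0, len(row)
    while lo < hi:
        mid = (lo + hi) // 2
        if row[mid] == 1:
            lo = mid + 1   # first 0 must be to the right of mid
        else:
            hi = mid       # row[mid] is 0, so first 0 is at mid or earlier
    return lo

def row_with_most_ones(A):
    # Outer loop: n iterations, each doing an O(log n) binary search
    # -> O(n log n) overall.
    return max(range(len(A)), key=lambda i: count_ones(A[i]))
```

For example, `row_with_most_ones([[1, 1, 0], [1, 0, 0], [1, 1, 1]])` returns `2`, since the last row has three 1's.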

Thanks