I'm learning data structures, and frankly I have read about binary search on arrays and I understand the concept itself. What confuses me is this: when my array size n is odd, n/2 has to be rounded either up to ceil(n/2) or down to floor(n/2) (for example, 11/2 becomes 6 or 5 after partitioning), and supposedly this will

**not affect the time complexity**! That is where I'm stuck!

In every iteration I partition my array of size n (assume n is odd) into roughly n/2. If we round n/2 up to ceil(n/2), then we have effectively added about +1 to the partitioned size, e.g. (11/2) + 1. I know a single "+1" doesn't matter asymptotically, but what actually confuses me is that it doesn't appear in just one iteration: in *every* iteration, after partitioning, I add another "+1" (from rounding n/2 up), so the constants accumulate. By the time I stop, I have added 1 + 1 + 1 + ... , one for every sub-array I partitioned. That accumulated constant is not a simple +1 anymore; it is a sum of 1's over all iterations, so we mustn't neglect it, and it might even grow to something like n * 1 = O(n)

**in total**... So why do we neglect that in the algorithm's time complexity? Why do we neglect a sum of constants? It might be dominant in total, am I wrong?!
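To make the accumulation concrete, here is a minimal sketch (the helper names are my own, not from any textbook) that counts how many halving steps binary search takes when the size is rounded down versus rounded up. The point is that the number of "+1"s equals the number of iterations, which is about log2(n), not n:

```python
def halvings(n, halve):
    """Count how many times n can be halved (with the given
    rounding) before the sub-array size drops to 1."""
    count = 0
    while n > 1:
        n = halve(n)
        count += 1
    return count

def floor_half(n):
    return n // 2          # round n/2 down: floor(n/2)

def ceil_half(n):
    return -(-n // 2)      # round n/2 up: ceil(n/2)

for n in (11, 100, 10**6):
    f = halvings(n, floor_half)
    c = halvings(n, ceil_half)
    # rounding up never costs more than one extra iteration
    print(n, f, c)
```

Running this shows both counts staying close to log2(n) (for n = 1,000,000 the counts are 19 and 20, nowhere near n), because rounding up adds at most one extra halving step in total, not one extra step's worth of work per element.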

What I understood from what I've read is that an *accumulated* constant can't be neglected... so?

It would be a pleasure if someone could explain when we can neglect constants and when we cannot! Thanks in advance, and thank you for your cooperation.

This post has been edited by **macosxnerd101**: 05 January 2019 - 02:23 PM

Reason for edit:: Renamed title to be more descriptive. Your topic has nothing to do with a binary search tree.