Find the next value in an array greater than a predefined value

I have the following array, which I loop over with a foreach:

$getdepth = '{"result":"true","asks":[[20,13],[34,20],[30,8],[35,8],[4,40],"bids":[["18",22],["16",74],["70",99],["65",18],["1",15]]}';
$depth = json_decode($getdepth, true);
foreach ($depth['asks'] as $k_asks => $v_asks) {
$array_asks[$k_asks] = $v_asks[0];}

I would like to know how to write a function that finds the next value in "asks" greater than 30. Inspecting the values visually, the answer is 34.

Note: testing value by value would be a solution, but the real array contains more than 2000 values, which makes that comparison impractical due to the processing load.

  • When you say "next", do you mean the first one (in order) or the lowest one (e.g., if the 35 appeared before the 34, which of the two would you want)? And will this operation be performed only once, or several times?

  • When I say "next", I mean ordering all the values in ascending order and finding the first one greater than the given value (30) --> 4, 20, 30, 34, 35. It will be performed several times, and the $getdepth array is not fixed; it changes every minute.

  • Okay, I wrote an answer. Unfortunately, I can’t do it with less than a full pass over the list. And if the list changes all the time, the simplest solution will probably perform best in practice (unless you have to do hundreds or more different searches on the same list before it changes).

1 answer

You will hardly get a better solution than testing value by value. A complete pass over the list is O(n), and that is the best you can reach, because in the worst case (i.e. the value you want does not exist, or it is the last one to be tested) you have to evaluate every element of the list at least once. By the way, if what you want is the "Minimax" (i.e. the lowest value greater than 30, and not the first one on the list, in order, that is greater than 30), then you will have to make a full pass anyway (to make sure there is no lower value further ahead).
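
To make that single pass concrete, here is a minimal PHP sketch, assuming the flat $array_asks of prices built by the foreach in the question (the function name is illustrative):

// One O(n) pass: find the lowest ask price strictly greater than $target.
function nextGreater(array $prices, $target) {
    $best = null;
    foreach ($prices as $price) {
        if ($price > $target && ($best === null || $price < $best)) {
            $best = $price;
        }
    }
    return $best; // null when no price exceeds $target
}

// nextGreater($array_asks, 30) returns 34 for the sample data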

If this is an isolated operation, I’m afraid I have nothing better to suggest... But if this is something you need to do over and over again, then you can amortize search time in one of the two ways below:

  1. Sort the array once (O(n*log(n))) and, for each search, use binary search (O(log(n))); see the sketch after this list. For a single operation this is slower (> O(n)), but for multiple operations it will be faster (< O(m*n), provided that m > log(n) up to constant factors).
  2. Place your elements in a binary search tree (same order of complexity as the solution above).
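
A minimal sketch of option 1, under the same assumption about the flat $array_asks (names are illustrative):

// Sort once per snapshot of the data (O(n*log(n))).
$sorted = $array_asks;
sort($sorted);

// Each query is then a binary search (O(log(n))) for the first element
// strictly greater than $target in the sorted array.
function firstGreaterSorted(array $sorted, $target) {
    $lo = 0;
    $hi = count($sorted);            // search interval is [$lo, $hi)
    while ($lo < $hi) {
        $mid = intdiv($lo + $hi, 2);
        if ($sorted[$mid] <= $target) {
            $lo = $mid + 1;          // everything up to $mid is too small
        } else {
            $hi = $mid;              // $mid is a candidate; keep looking left
        }
    }
    return $lo < count($sorted) ? $sorted[$lo] : null;
}

// firstGreaterSorted($sorted, 30) returns 34 for the sample data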

The above solutions apply to the "Minimax" case. In the other case (the first value, in list order, that is greater than the target), before applying the suggested solution, create an array with the maximum value found up to each index:

[[20,13],      [34,20],      [30,8],      [35,8],      [4,40]     ]
[[20,[20,13]], [34,[34,20]], [34,[30,8]], [35,[35,8]], [35,[4,40]]]

And sort/search it instead of the original array.
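
A minimal PHP sketch of that preprocessing step, again assuming the flat array of ask prices (the helper name is hypothetical):

// Build a running-maximum array: $result[$i] holds the largest price seen in
// $prices[0..$i]. The result is non-decreasing, so the binary search from the
// previous sketch applies to it, and its first entry greater than the target
// marks the position of the first original element greater than the target.
function runningMax(array $prices) {
    $result = [];
    $max = null;
    foreach ($prices as $price) {
        if ($max === null || $price > $max) {
            $max = $price;
        }
        $result[] = $max;
    }
    return $result;
}

// runningMax([20, 34, 30, 35, 4]) returns [20, 34, 34, 35, 35]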

P.S. Are you having performance problems in practice? 2000 elements does not strike me as large enough to justify a more complex solution...

  • Your answer is complex. The system runs on two machines: a server that provides the information from the database (the array), and a terminal (PC) that will run this routine along with others at the same time, on top of the server/terminal connection lag.

  • @user3486019 Yes, my recommendation is not to bother with it. Just test value by value, since there is a good chance the overall performance will be better. Or, if feasible, do the time-consuming part (sorting the list or building the search tree, whichever is easier) on the faster server, and leave only the search itself to the slower machine.
