usability study – Is a repetitive three-second response time the absolute worst?

My very first IT manager, back in the 90s, once stated that the absolute worst repetitive response time an application can have is three seconds: long enough to be a significant annoyance, yet too short to allow a (cognitive) break. So if you want to design a system for maximum frustration, give every action an average response time of three seconds (and allow for some variation too, just to make it unpredictable as well).

This was my manager’s anecdotal input, but I have always considered it valid, and it seems to make sense given Jakob Nielsen’s thoughts on the matter.

Is there any research to back up (or invalidate) the claim?

So OVH sent me a DMCA notice for no reason, from someone claiming I used his pics. I told them I had sorted the issue, and they just suspended me. Now I can’t…

PayPal is the WORST company on the INTERNET!!!

I’ve been using PayPal for a number of years now and, to be honest, I’ve never had any problem. I was using it for daily transactions, and after a couple of weeks, having processed around $500, I was asked to verify my account, which I did.

A few months later, after the PayPal account had handled a couple of thousand dollars’ worth of transactions, I was asked to prove my identity by sending a copy of either my driving licence or my passport, which I duly did, and I’ve never had to do anything since.

I did once have a security issue, but it was dealt with quickly and efficiently, and what money was lost was promptly refunded by them, so I can’t complain at all.

algorithm analysis – Finding the worst-case running time of this piece of code?

I am working with this code:

function strange(list a(0..n-1) of integers such that abs(a(i)) ≤ n
                 for every 0 ≤ i ≤ n - 1, list b(0..2n) of zeroes)

    for i ← 0 to n - 1 do
        a(i) ← a(i) + n
    for i ← 0 to n - 1 do
        for j ← 0 to abs(a(i) - 1) do
            b(j) ← b(j) + 1
    return b

I am trying to figure out the worst-case running time of the code above. So far I’m guessing that the first for loop runs n times, but I’m not sure how to prove this, and I’m unsure how to approach the second and third for loops.
If possible, could someone help me solve this?
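For what it’s worth, one way to convince yourself of the bound is to transcribe the pseudocode into a runnable language and count inner-loop executions. A Python sketch (my own transcription, not the original code; I read “for j ← 0 to abs(a(i) - 1)” as an inclusive range):

```python
def strange(a):
    """Transcription of the pseudocode; also counts inner-loop steps."""
    n = len(a)
    assert all(abs(x) <= n for x in a)
    b = [0] * (2 * n + 1)                   # list b(0..2n) of zeroes
    for i in range(n):                      # first loop: exactly n iterations
        a[i] = a[i] + n                     # afterwards 0 <= a[i] <= 2n
    steps = 0
    for i in range(n):
        for j in range(abs(a[i] - 1) + 1):  # "0 to abs(a(i) - 1)", inclusive
            b[j] += 1
            steps += 1
    return b, steps

# Worst case: every a[i] = n, so after the shift a[i] = 2n and the inner
# loop runs 2n times per i, i.e. 2n^2 steps overall -> Theta(n^2).
n = 50
_, steps = strange([n] * n)
print(steps)  # 2 * 50**2 = 5000
```

The first loop is Θ(n) by inspection (a constant-time body executed exactly n times); the nested pair dominates, giving Θ(n²) in the worst case.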

Difficulty understanding the use of arbitrary function for the worst case running time of an algorithm

In CLRS the author says:

“Technically, it is an abuse to say that the running time of insertion sort is $O(n^2)$,
since for a given $n$, the actual running time varies, depending on the particular
input of size $n$. When we say “the running time is $O(n^2)$,” we mean that there is a
function $f(n)$ that is $O(n^2)$ such that for any value of $n$, no matter what particular
input of size $n$ is chosen, the running time on that input is bounded from above by
the value $f(n)$. Equivalently, we mean that the worst-case running time is $O(n^2)$.”

What I have difficulty understanding is why the author talks about an arbitrary function $f(n)$ instead of $n^2$ directly.

I mean, why didn’t the author write:

“When we say “the running time is $O(n^2)$,” we mean that for any value of $n$, no matter what particular input of size $n$ is chosen, the running time on that input is bounded from above by the value $cn^2$ for some positive constant $c$ and all sufficiently large $n$. Equivalently, we mean that the worst-case running time is $O(n^2)$.”

I have a very limited understanding of this subject, so please forgive me if my question is too basic.
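One way to see the distinction: the running time $T(x)$ depends on the concrete input $x$, not only on its size, so “the running time” is not a single function of $n$ that could be compared against $cn^2$. Introducing $f$ repairs this by taking the pointwise maximum over all inputs of each size:

```latex
f(n) \;=\; \max_{x \,:\, |x| = n} T(x),
\qquad
f(n) = O(n^2)
\;\iff\;
\exists\, c > 0,\ n_0 :\ f(n) \le c\,n^2 \ \text{for all } n \ge n_0 .
```

This $f$ is exactly the worst-case running time, which is why the two phrasings in the quote are equivalent; the proposed rewording simply unfolds the same definition with the bound $cn^2$ substituted in.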

asymptotics – Finding the Time Complexity – Worst Case (Big-Θ) – Array List, BST

Hi, I’m a bit confused about how to find the worst-case time complexity of the following operations in terms of big-Θ; I’ve figured out (1) and (2).

What is the worst-case time complexity, in terms of big-Θ, of each of these operations:
(1) insert an element in the array list = Θ(1)
(2) remove an element from the array list (e.g. remove an occurrence of the number 5) = Θ(n)

(3) remove the second element from the array list (i.e. the one in position 1)

(4) count the number of unique elements it contains (i.e. the number of elements excluding duplicates; e.g. (6,4,1,4,3) has 4 unique elements)

Suppose you have an initially empty array list with an underlying array of length 10. What is the length of the underlying array after:

(5) inserting 10 elements in the array list
(6) inserting 10 more elements in the array list (i.e. 20 elements in total)
(7) inserting 10000 more elements in the array list (i.e. 10020 elements in total)
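The answers to (5)–(7) depend on the growth policy of the array list, which the exercise presumably specifies. Assuming the common double-when-full policy (an assumption on my part; real implementations differ, e.g. Java’s ArrayList grows by roughly 1.5x), a quick simulation:

```python
def capacity_after(num_inserts, initial_capacity=10):
    """Simulate an array list whose underlying array doubles whenever
    an insert finds it full (assumed growth policy)."""
    cap, size = initial_capacity, 0
    for _ in range(num_inserts):
        if size == cap:   # array is full: allocate a new one, twice as big
            cap *= 2
        size += 1
    return cap

print(capacity_after(10))     # 10: the array never overflows
print(capacity_after(20))     # 20: doubled once (10 -> 20)
print(capacity_after(10020))  # 10240: 10 -> 20 -> 40 -> ... -> 10240
```

Under a different growth factor the final capacities change, but the doubling policy is what makes insertion amortized Θ(1) in question (1).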

What is the worst-case time complexity, in terms of big-Θ, of each of these operations on binary search trees:
(8) add an element in the tree (assuming that the tree is balanced)
(9) add an element in the tree (without assuming that the tree is balanced)
(10) find the largest element in the tree (assuming that the tree is balanced)
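For (8)–(10), the balanced/unbalanced difference can be seen concretely. A sketch (my own illustration, not part of the exercise): inserting keys in sorted order into a plain BST degenerates it into a path, so a single add costs Θ(n); in a balanced tree the height, and hence the cost of an add or of finding the largest element, is Θ(log n).

```python
def bst_insert(root, key):
    """Insert key into a plain (unbalanced) BST of dict nodes."""
    if root is None:
        return {"key": key, "left": None, "right": None}
    side = "left" if key < root["key"] else "right"
    root[side] = bst_insert(root[side], key)
    return root

def height(node):
    """Height in edges; -1 for the empty tree."""
    if node is None:
        return -1
    return 1 + max(height(node["left"]), height(node["right"]))

# Worst case for (9): sorted insertions build a path, so the height,
# and the cost of one more add, is Theta(n).
root = None
for k in range(100):
    root = bst_insert(root, k)
print(height(root))  # 99

# For (8) and (10): a balanced tree over n keys has height Theta(log n),
# and the largest element is found by following right children from the
# root, so both operations cost Theta(log n) there.
```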

What is Simple Uniform Hashing, and why does searching a hash table take Θ(n) in the worst case?

Can anyone explain clearly what Simple Uniform Hashing is, and why searching a hash table has complexity Θ(n) in the worst case if we don’t have uniform hashing (where n is the number of elements in the hash table)?
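Informally, simple uniform hashing is the assumption that each key is equally likely to hash into any of the m buckets, independently of the other keys; under it, the expected chain length in a chained table is n/m. Without it, nothing stops every key from landing in the same bucket, and a search must walk a chain of length n. A minimal sketch (my own illustration, assuming chaining):

```python
class ChainedHashTable:
    """Minimal hash table with chaining. With a uniform hash, chains have
    expected length n/m; with a degenerate hash that maps every key to
    the same bucket, a search scans all n elements."""
    def __init__(self, num_buckets, hash_fn):
        self.buckets = [[] for _ in range(num_buckets)]
        self.hash_fn = hash_fn

    def insert(self, key):
        self.buckets[self.hash_fn(key) % len(self.buckets)].append(key)

    def search(self, key):
        """Return the number of chain entries probed."""
        probes = 0
        for k in self.buckets[self.hash_fn(key) % len(self.buckets)]:
            probes += 1
            if k == key:
                return probes
        return probes  # not found: scanned the whole chain

bad = ChainedHashTable(100, lambda k: 0)  # every key -> bucket 0
for k in range(1000):
    bad.insert(k)
print(bad.search(999))  # 1000 probes: Theta(n) worst-case search
```

Even a good hash function has this Θ(n) worst case for some adversarial key set; simple uniform hashing is what lets CLRS argue the Θ(1 + n/m) *expected* bound instead.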

End-to-end Schengen and D-type visas; what is the worst that could happen if I don’t re-enter the country?

Since D-Visas, which are issued to take up residence, are authorized by the Foreigners Authority (Ausländerbehörde), that authority will know best what type of visa it has issued and how it can be used.

The Schengen Borders Code does require that somebody leave and re-enter the Schengen Area so that compliance with the 90/180-day rule can be checked.

A National D-Visa can be issued to override this rule:

  • AVwV AufenthG, Ziff. (after a National Visa ends)
  • AVwV AufenthG, Ziff. (before a National Visa starts)

You should check your D-Visa Remarks field to see if it (also) contains:

  • § 7 Absatz 1 Satz 3 AufenthG

If yes, then it is a clear-cut case. The fact that the D-Visa starts the day after the C-Visa expires is also a sign that it was issued with this situation in mind.

Pages 138/139 of the Visahandbook:
Berechnung der Bezugszeiträume bei Schengen-Visa/ Anrechnung von Voraufenthaltszeiten

(b) Rechtmäßiger Aufenthalt in Deutschland
Sowohl bei „Negativstaatern“ (Staatsangehörige der Drittländer, die in Anhang I der Verordnung (EG) Nummer 539/2001 des Rates zur Aufstellung der Liste der Drittländer, deren Staatsangehörige beim Überschreiten der Außengrenzen im Besitz eines Visums sein müssen) als auch bei „Positivstaatern“ (Staatsangehörigen der Drittländer, die in Anhang II der o. a. Verordnung aufgeführt sind) ist es grundsätzlich bei einem kurzfristigen Aufenthalt, der an einen Aufenthalt in Deutschland nach nationalem Recht anschließt (z.B. als Student), notwendig, dass erst eine Ausreise aus dem Schengen-Gebiet und eine anschließende Wiedereinreise erfolgt, damit die erforderlichen Einreisevoraussetzungen nach Art. 6 Abs. 1 Schengener Grenzkodex überprüft werden können.

Calculation of reference periods for Schengen visas / crediting of periods of previous residence

(b) Legal residence in Germany
Both for “Negativstaater” (nationals of the third countries listed in Annex I of Council Regulation (EC) No 539/2001 establishing the list of third countries whose nationals must be in possession of a visa when crossing the external borders) and for “Positivstaater” (nationals of the third countries listed in Annex II of the above regulation), a short-term stay that follows a stay in Germany under national law (e.g. as a student) generally requires an exit from the Schengen area and a subsequent re-entry, so that the necessary entry requirements under Art. 6(1) of the Schengen Borders Code can be checked.

Eine Ausnahme bildet die Erteilung einer für drei Monate gültigen Aufenthaltserlaubnis durch die Ausländerbehörde nach § 7 Absatz 1 Satz 3 AufenthG bei Vorliegen bestimmter Voraussetzungen. Hintergrund ist, dass man Ausländern den Aufwand einer aus rein formalen Gründen vorzunehmenden Aus- und Wiedereinreise ersparen möchte.

Zur weiteren Erläuterung siehe: AVwV AufenthG, Ziff. und

An exception is the issuance of a residence permit valid for three months by the immigration authority under Section 7(1) sentence 3 AufenthG, where certain requirements are met. The background is that foreigners should be spared the effort of an exit and re-entry undertaken for purely formal reasons.

For further explanation see: AVwV AufenthG, no. and


game design – Why has the Final Fantasy series largely changed for the worse (or JRPGs/RPGs in general)?

From what many remember as the open-world, explorable, side-quest-rich, tactically challenging RPGs/JRPGs of the 90s and even the 2000s, the genre now seems largely diminished, especially the FF series, as one of its “top dogs”. I get the impression that much of what made the old games good has been lost:

  • What was once an explorable main navigation element has become more centered, linear, and restrictive. You get nicer walking animations and prettier backgrounds, but either the same “tunnel” of forced forward movement, or aimless wandering across huge open areas. Both replace the old emphasis on simple, often smaller rooms that offered less graphically but more in the way of travel and exploration than mere “hunting” or “running around and grinding.” If you make a large area, it should offer something beyond “lots of space”; if you scale the space up, you need to scale that “something” up too, otherwise it just feels emptier.

  • The old freely explorable world maps and airship/flying-ship mechanics (including revisiting earlier areas) are almost always cut down or implemented much less attractively. This arguably started with FFX and continued onwards: you can “explore” quickly, but the process is largely watered down. The whole “open world” aspect of the classics has been reduced to large fields; no longer a blend of different terrains, sub-areas, and sections, or the general experience of traveling between, entering, and exiting distinct areas, but storyline-driven linear rules imposed on every area and path.

  • The battle system is definitely a hot topic: some will tell you the new mechanics bring fresh flesh to the table, while others feel the older system worked best and has been “slaughtered” in the mere attempt to “spice up” something most people already liked and had settled into over time. The thing is, if the battle system is to be made “better,” it should preserve the skeleton the original battle systems were built on. The old turn-based games, for example, kept that skeleton even as they became “active time” battles in which every character grabs a turn as quickly as possible: the mechanics were tweaked while the skeletal base was maintained. Much of the newer work instead strikes out on an entirely new path of experimentation, impositions, new rules, and sometimes unneeded “extra steps.” It is as if the game’s old, functioning system is given less concern while its DNA is “spliced” in a Frankenstein direction, on the impression that you can better an old thing by going somewhere completely new, rather than sprucing up the initial base in a specific, targeted manner that fulfills its original lifeblood.

  • Always an extreme. Nowadays games in this series are either too linear or not linear at all; there is no longer a good balance between the two. One game may make specific areas so massive and explorable that you get lost, tired, or grind excessively in one area, then move to the next and rinse and repeat. Another goes super linear (think FFXIII), where everything is “new area, go straight, battle, story, repeat.” What made the classics arguably more “wow” is that the game, when it needed to, switched from storyline linearity to open exploration (piquing the natural exploring instinct) and back to restriction when danger arose (a protective inhibition). Because both angles match human behavior, the alternation suits gameplay; forcing either extreme for too long fails to align with the natural cycle of engagement and its proper “ups” and “downs.”

  • More “complex” systems for leveling and power-ups. In the old games these were fairly simple and direct, yet expandable enough to keep growing and delivering. “Easy but expandable” has been replaced by “complex and rigid”: steeper learning curves, but less direction to go once you “have it.” I think the series drifted this way gradually, possibly starting around FFXIII. Junctioning in FF8, for instance, starts simple but scales up to interesting things as the game goes on, especially once GFs feed into character stats. The “powering system” of FFXIII, by contrast, feels like a weak remnant of FFX and FFXII in both combat and stat growth.

  • Games and their scenes (this probably applies outside this genre and series too) are now largely presented as cinematics with bits of gameplay, to the point where it is unclear whether a title is innately a movie with gameplay or gameplay with cinematics. In the older games of the series, “movies” in the FMV class were far less emphasized as part of the overall game; cutscenes were mostly seamless or passive. Now, thanks to the graphics, they are expected to carry and “fill” much of the game’s impression, rather than forming a seamless flow in which only particular moments carry extra weight. Ask how much of an old FF’s story would be lost by removing the dialogue, locked moments, and cutscenes, then compare that to how much would be lost in a modern game. If there is more to “lose” from the cutscenes overall, then perhaps they are relied on too heavily to shape the impression and experience of the game.

complexity theory – Worst Case for AVL Tree Balancing after Deletion

After deleting a node in an AVL tree, self-balancing (e.g. a zig-zag or left-right rotation) maintains the O(log n) time bound that unbalanced trees (like a plain BST) cannot guarantee.

A single balancing operation is said to be O(1).

What is the worst case for balancing?

Is there a specific type of tree that produces the worst case?
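A sketch of the standard answer: each individual rebalancing step (a single or double rotation) is O(1), but after a deletion the height of a subtree can shrink and the imbalance can propagate upward, so up to Θ(log n) rotations may be needed along the search path. The classic worst-case shape is the Fibonacci tree, the sparsest tree the AVL balance condition allows, in which deleting a leaf on the shallow side can trigger rotations all the way to the root. Its size follows the recurrence below (my own illustration):

```python
import math

def fib_tree_size(h):
    """Nodes in the Fibonacci tree of height h, the minimal AVL tree of
    that height: N(0) = 1, N(1) = 2, N(h) = N(h-1) + N(h-2) + 1."""
    if h == 0:
        return 1
    if h == 1:
        return 2
    return fib_tree_size(h - 1) + fib_tree_size(h - 2) + 1

# n grows only exponentially in h, so h = Theta(log n); a deletion whose
# rebalancing cascades over the whole path thus costs Theta(log n)
# rotations, even though each rotation is O(1).
for h in (5, 10, 20):
    n = fib_tree_size(h)
    print(h, n, h / math.log2(n))  # the ratio approaches ~1.44
```

Whether a given deletion actually forces a rotation at every level depends on which node is removed; the Fibonacci shape is the standard construction for exhibiting the Θ(log n) cascade.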