Reply To: Can branch predictors perfectly predict counting loops?

Ivan Godard
Keymaster

Your idea is possible in principle, but the details are problematic. In essence you propose a deferred branch whose transfer is detached from the evaluation of the predicate, similar to the way Mill separates load issue from load retire.

The deferred branch, DB, would contain a predicate, a target address, and a deferral cycle count. At DB issue time the predictor has already made an exit prediction, and fetch has already run down the prediction chain loading the code of the predicted path. If the DB predicate evaluates false, the DB is ignored. If it evaluates true, the DB target address should be compared with the predicted target and the deferral with the remaining count of the prediction; if they are equal, the DB is asserting the same transfer as the prediction and can be ignored. If they differ, the fetch/decode logic needs to be reset, essentially in the same way as with a mispredict: update the pending count and target and restart the fetch chain at the new target.
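To make that issue-time decision procedure concrete, here is a minimal software sketch of it. Everything in it (the names db_t, pred_t, issue_db, and the field layout) is my own illustration under the assumptions above, not Mill ISA or any actual implementation:

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical model of a deferred branch as described above. */
typedef struct {
    bool     predicate;   /* evaluated at DB issue */
    uint64_t target;      /* transfer address */
    unsigned deferral;    /* cycles from issue until the transfer */
} db_t;

/* Hypothetical model of the predictor's current exit prediction. */
typedef struct {
    uint64_t target;      /* predicted exit address */
    unsigned remaining;   /* predicted cycles until the exit is taken */
} pred_t;

typedef enum { DB_IGNORE, DB_AGREES, DB_RESET_FETCH } db_action_t;

/* Decide what the front end must do when a DB issues. */
db_action_t issue_db(const db_t *db, const pred_t *pred)
{
    if (!db->predicate)
        return DB_IGNORE;       /* predicate false: the DB is ignored */

    if (db->target == pred->target && db->deferral == pred->remaining)
        return DB_AGREES;       /* asserts the same transfer; also ignorable */

    /* Disagreement: update the pending count and target and restart the
     * fetch chain at the new target, a mispredict-style reset. */
    return DB_RESET_FETCH;
}
```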

This could be done. However there are both semantic and implementation issues. One is the time required to reset the fetch. If the target is not in the I0 microcache then the reset would take roughly as long as mispredict recovery, i.e. five cycles in our test configs. Even an I0 hit would likely need three cycles to reset. If the deferral was less than the remaining count as predicted then we would already have executed a ways down the predicted (pre-reset) path and would need a full unwind miss recovery, so DB buys nothing. How often can we eval a predicate five cycles before the transfer? Not often I'd guess, but I have no numbers.
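For illustration only, the cost cases in that paragraph can be written out as a small sketch. The cycle constants are the numbers quoted above for the test configs; the names (db_disagreement_cost, in_i0_microcache) are hypothetical helpers, not real hardware interfaces:

```c
#include <stdbool.h>
#include <stdint.h>

enum {
    RESET_I0_HIT  = 3,   /* target already in the I0 microcache */
    RESET_I0_MISS = 5,   /* roughly full mispredict recovery */
};

extern bool in_i0_microcache(uint64_t target);  /* hypothetical probe */

/* Cycles lost when an issued DB disagrees with the running prediction.
 * 'deferral' is the DB's cycles-until-transfer; 'predicted_remaining'
 * is the predictor's cycles-until-exit at the same moment. */
unsigned db_disagreement_cost(uint64_t target, unsigned deferral,
                              unsigned predicted_remaining)
{
    if (deferral < predicted_remaining) {
        /* The real transfer lands before the predicted exit, so execution
         * has already run past it down the pre-reset predicted path:
         * full unwind miss recovery, and the DB buys nothing. */
        return RESET_I0_MISS;
    }
    /* Otherwise only the front end must be redirected at the new target. */
    return in_i0_microcache(target) ? RESET_I0_HIT : RESET_I0_MISS;
}
```

The sketch also shows why the predicate would need to be evaluated about five cycles ahead of the transfer for the DB to hide the reset latency at all.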

A semantic issue is how the DB interacts with other branches. Suppose a regular branch is executed between DB issue/eval and transfer? Who wins? I’m sure you can think of similar conundrums.