Nintendo's Mariko tables result in a trained frequency of 1599999 instead of 1600000.
PCV checks for rate == 1600000 exactly when doing EMC init.
Thus EMC init does not succeed when DRAM has been trained to the nominal 1600000 rate, since the actual trained rate is 1599999.
PCV has a fudge factor of 1000 used in SetEmcDvfsFreq, but this is not used in InitEmcDvfs.
Because of this failure, PCV cannot change the rate back to 204MHz before sleep, and
extremely degraded performance is observed after wake.
Restoring DRAM to 204MHz before boot allows EMC init to succeed and fixes the performance degradation.
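A minimal sketch of the mismatch, with hypothetical function shapes (only SetEmcDvfsFreq, InitEmcDvfs, and the 1000 fudge factor come from PCV; the checks below are illustrative, not PCV's code):

```c
#include <stdbool.h>
#include <stdint.h>

#define EMC_RATE_FUDGE 1000 /* fudge factor used by SetEmcDvfsFreq */

/* Illustrative: an InitEmcDvfs-style exact match rejects the trained rate. */
static bool init_emc_dvfs_rate_ok(uint32_t rate) {
    return rate == 1600000; /* 1599999 -> false, so EMC init fails */
}

/* Illustrative: a SetEmcDvfsFreq-style check with the fudge factor
 * would have accepted the Mariko-trained 1599999. */
static bool set_emc_dvfs_rate_ok(uint32_t rate) {
    uint32_t diff = rate > 1600000 ? rate - 1600000 : 1600000 - rate;
    return diff <= EMC_RATE_FUDGE; /* |1599999 - 1600000| = 1 -> true */
}
```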
* ams: replace sept with tsec firmware
This replaces sept with a custom tsec key derivation firmware.
NOTE: This does not use any TSEC exploits whatsoever; it is a well-signed
TSEC binary assembled with envyas and signed with the real cauth key.
For more details, contact SciresM#0524.
* fusee: only set SBK if it's readable
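A hedged sketch of that check (the FUSE_PRIVATE_KEY0 address and the all-ones read-back once the SBK has been hidden are assumptions about the Tegra fuse block, not code from fusee):

```c
#include <stdint.h>

#define FUSE_PRIVATE_KEY0 0x7000F9A4u /* assumed address of the first SBK fuse word */

/* Assumption: once the private-key-disable sticky bit has been set, the
 * four SBK fuse words read back as all ones; in that case the value is
 * unreadable and must not be loaded into the security engine keyslot. */
static int sbk_is_readable(void) {
    volatile const uint32_t *sbk = (volatile const uint32_t *)FUSE_PRIVATE_KEY0;
    for (int i = 0; i < 4; ++i) {
        if (sbk[i] != 0xFFFFFFFFu) {
            return 1; /* at least one word differs from the hidden pattern */
        }
    }
    return 0;
}
```

Under this reading, fusee would simply skip the keyslot write when sbk_is_readable() returns 0.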
Use 204 MHz as host clock in SDR104 mode instead of 136 MHz.
Due to this, also change the frequency init divider so the
initialization frequency is below 400 kHz.
This makes the clocks for SDMMC1 in all modes match the TRM table.
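As a worked check (assuming the standard SDHCI divided-clock scheme, SDCLK = host clock / (2 * divisor); which encoding SDMMC1 actually uses here is an assumption): with a 204 MHz host clock, the divisor must exceed 255 to land below 400 kHz, and the sketch below finds the smallest power-of-two value.

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    const uint32_t host_khz  = 204000; /* new SDR104 host clock */
    const uint32_t limit_khz = 400;    /* card identification ceiling */

    /* Smallest power-of-two SDHCI divisor with SDCLK <= 400 kHz. */
    uint32_t divisor = 1;
    while (host_khz / (2 * divisor) > limit_khz) {
        divisor *= 2;
    }
    printf("divisor %u -> %u kHz\n", divisor, host_khz / (2 * divisor));
    /* prints: divisor 256 -> 398 kHz */
    return 0;
}
```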
Make it clear in the code that HS200/HS400 modes in fact use PLLP_OUT0
and not PLLC4_OUT2_LJ as the comment suggests. Selecting PLLC4_OUT2_LJ
as the clock source actually results in a freeze after switching to
HS200/HS400 mode. This is most likely because PLLC4 is not enabled,
but it should be checked later.
Set the HS200/HS400 divider to 3, as this is what the code actually
set prior to this change, so this commit does not change behavior.
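For reference, assuming PLLP_OUT0 runs at the usual 408 MHz and the clock source uses the Tegra U7.1 divisor encoding (f_out = f_src * 2 / (N + 2)), a divider value of 3 works out as follows:

```c
#include <stdio.h>

int main(void) {
    const double pllp_out0_mhz = 408.0; /* assumed PLLP_OUT0 rate */
    const int    divider       = 3;     /* value the code already set */

    /* Tegra U7.1 divisor: f_out = f_src * 2 / (N + 2) */
    double out_mhz = pllp_out0_mhz * 2.0 / (divider + 2);
    printf("HS200/HS400 clock: %.1f MHz\n", out_mhz); /* 163.2 MHz */
    return 0;
}
```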
Configure the Legacy 12 MHz clock to run at 12 MHz using the SW default
configuration (as per the TRM) for the SDMMC legacy timer.
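Under the same assumptions as above (408 MHz PLLP_OUT0 and the U7.1 encoding), a legacy-timer divisor of 66 would yield 408 * 2 / 68 = 12 MHz; the actual SW-default register value should be taken from the TRM.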
Introduce an initial version of sdmmc_host_clock_delay() for use in
places where the wait is host clock dependent. As implemented now, it
does not change the sleeps that were previously used in its place.
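A minimal sketch of what such a helper can look like (the name sdmmc_host_clock_delay comes from this change; the body, the parameter list, and the usleep() dependency are assumptions):

```c
#include <stdint.h>

void usleep(uint32_t microseconds); /* assumed existing sleep primitive */

/* Sketch: wait for a number of host clock cycles by rounding the
 * equivalent time up to whole microseconds, so it can stand in for
 * the fixed sleeps without ever shortening them. */
static void sdmmc_host_clock_delay(uint32_t num_cycles, uint32_t host_clock_hz) {
    uint64_t us = ((uint64_t)num_cycles * 1000000u + host_clock_hz - 1) / host_clock_hz;
    usleep((uint32_t)us);
}
```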