We are thinking of gradually trickling out the indexing/publishing of our pages because of an experiment we are running.
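For concreteness, here is a minimal sketch of what I mean by trickling. This assumes we gate indexing per page with a robots meta tag and remove it in batches over time; the specific tag values are just one possible mechanism, not necessarily what we will use.

```html
<!-- Phase 1: page is live and crawlable, but blocked from the index -->
<meta name="robots" content="noindex">

<!-- Phase 2 (weeks later, applied to a batch of pages at a time):
     the block is lifted so the page can be indexed -->
<meta name="robots" content="index, follow">
```

In other words, the full site ships at once, but the set of indexable pages grows slowly.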
I am trying to understand how this will hurt our SEO. In a Google Hangout video titled “English Google Webmaster Central office-hours hangout” featuring John Mueller in 2017, at the 19:54 mark he states that
“artificially introducing a kind of a trickle into the index is something that often causes more problems than it solves anything.”
I am trying to understand why. My assumption is that Google maintains an internal index it uses to map the web, and that a slow trickle would force the crawler to revisit and revalidate those URLs over and over again.
Any thoughts on this would be very helpful. Thank you.