internal/encoding/json: fix performance cliff when decoding large integers that will go out of range

For large positive integers, add a check on the number of decimal digits
before converting the number to a plain integer without an exponent.

If the exponent is large, the previous implementation could end up
constructing a very long string padded with zeroes, which is wasted work
because the subsequent strconv.Parse{Uint,Int} call will reject the value
anyway.
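A minimal sketch of the idea (the expandNaive/expandChecked helpers below
are hypothetical, not part of the package, and the sketch assumes a
non-negative exponent):

    package main

    import (
        "fmt"
        "strconv"
        "strings"
    )

    // expandNaive mirrors the old behaviour: materialize every zero implied
    // by the exponent before handing the string to strconv.ParseUint.
    func expandNaive(intPart string, exp int) (uint64, error) {
        s := intPart + strings.Repeat("0", exp) // may allocate a huge string
        return strconv.ParseUint(s, 10, 64)
    }

    // expandChecked mirrors the fix: bail out once the digit count already
    // exceeds what a uint64 can hold (at most 20 decimal digits).
    func expandChecked(intPart string, exp int) (uint64, error) {
        const maxDigits = 20
        if len(intPart)+exp > maxDigits {
            return 0, fmt.Errorf("value out of range")
        }
        return expandNaive(intPart, exp)
    }

    func main() {
        fmt.Println(expandChecked("1", 5)) // 100000 <nil>
        // "1e100000": expandNaive would build a 100001-byte string only for
        // ParseUint to reject it; expandChecked fails immediately instead.
        fmt.Println(expandChecked("1", 100000)) // 0 value out of range
    }

As in the diff below, intpSize+exp bounds the length of the would-be
digit string, so rejection happens before any large allocation.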

Fixes golang/protobuf#1002.

Change-Id: I65bfad304401e076743853d7501786b7231b083b
Reviewed-on: https://go-review.googlesource.com/c/protobuf/+/213717
Reviewed-by: Damien Neil <dneil@google.com>
diff --git a/internal/encoding/json/number.go b/internal/encoding/json/number.go
index 27faa69..529331f 100644
--- a/internal/encoding/json/number.go
+++ b/internal/encoding/json/number.go
@@ -239,6 +239,14 @@
 			return "", false
 		}
 
+		// Make sure resulting digits are within max value limit to avoid
+		// unnecessarily constructing a large byte slice that may simply fail
+		// later on.
+		const maxDigits = 20 // Max uint64 value has 20 decimal digits.
+		if intpSize+exp > maxDigits {
+			return "", false
+		}
+
 		// Set cap to make a copy of integer part when appended.
 		num = n.intp[:len(n.intp):len(n.intp)]
 		num = append(num, n.frac...)