PackWriter: Support reuse of entire packs
The most expensive part of packing a repository for transport to
another system is enumerating all of the objects in the repository.
Once a repository reaches the size of linux-2.6 (1.8 million
objects), enumeration can take several CPU minutes and consume a
large amount of temporary working-set memory.
Teach PackWriter to efficiently reuse an existing "cached pack"
by answering a clone request with a thin pack followed by a larger
cached pack appended to the end. This requires the repository
owner to first construct the cached pack by hand, and record the
tip commits in $GIT_DIR/objects/info/cached-packs:
  cd $GIT_DIR
  root=$(git rev-parse master)
  tmp=objects/.tmp-$$
  names=$(echo $root | git pack-objects --keep-true-parents --revs $tmp)
  for n in $names; do
    chmod a-w $tmp-$n.pack $tmp-$n.idx
    touch objects/pack/pack-$n.keep
    mv $tmp-$n.pack objects/pack/pack-$n.pack
    mv $tmp-$n.idx objects/pack/pack-$n.idx
  done
  (echo "+ $root";
   for n in $names; do echo "P $n"; done;
   echo) >>objects/info/cached-packs
  git repack -a -d
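For reference, the subshell in the script above appends one record to
objects/info/cached-packs of the following shape (the values shown are
placeholders, not real hashes):

```
+ <SHA-1 of the tip commit passed to pack-objects>
P <pack name hash printed by pack-objects>
<empty line terminating the record>
```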
When a clone request needs to include $root, the corresponding
cached pack will be copied as-is, rather than enumerating all of
the objects that are reachable from $root.
For a linux-2.6 kernel repository whose normal packed size is about
376 MiB, the above process creates two packs of 368 MiB and 38 MiB[1].
This is a local disk usage increase of ~26 MiB, caused by reduced
delta compression between the large cached pack and the smaller
recent-activity pack. The overhead is roughly one full copy of
the compressed project sources.
With this cached pack in hand, JGit daemon completes a clone request
in 1m17s less time, but with a slightly larger data transfer (+2.39 MiB):
Before:
  remote: Counting objects: 1861830, done
  remote: Finding sources: 100% (1861830/1861830)
  remote: Getting sizes: 100% (88243/88243)
  remote: Compressing objects: 100% (88184/88184)
  Receiving objects: 100% (1861830/1861830), 376.01 MiB | 19.01 MiB/s, done.
  remote: Total 1861830 (delta 4706), reused 1851053 (delta 1553844)
  Resolving deltas: 100% (1564621/1564621), done.

  real  3m19.005s

After:
  remote: Counting objects: 1601, done
  remote: Counting objects: 1828460, done
  remote: Finding sources: 100% (50475/50475)
  remote: Getting sizes: 100% (18843/18843)
  remote: Compressing objects: 100% (7585/7585)
  remote: Total 1861830 (delta 2407), reused 1856197 (delta 37510)
  Receiving objects: 100% (1861830/1861830), 378.40 MiB | 31.31 MiB/s, done.
  Resolving deltas: 100% (1559477/1559477), done.

  real  2m2.938s
Repository owners can periodically refresh their cached packs by
repacking their repository, folding all newer objects into a larger
cached pack. Since repacking is already considered to be a normal
Git maintenance activity, this isn't a very big burden.
[1] In this test $root was set back about two weeks.
Change-Id: Ib87131d5c4b5e8c5cacb0f4fe16ff4ece554734b
Signed-off-by: Shawn O. Pearce <spearce@spearce.org>
Implement similarity based rename detection
Content similarity based rename detection is performed only after
a linear-time pass has paired up entries whose content matches
exactly, by comparing ObjectIds. Any names paired during that
exact-match phase are excluded from the inexact similarity-based
detection, which reduces the search space that must be considered.
During rename detection two entries cannot be marked as a rename
if they are different types of files. This prevents a symlink from
being renamed to a regular file, even if their blob content appears
to be similar, or is identical.
Efficiently comparing two files is performed by building up two
hash indexes, hashing lines or short blocks from each file and
counting the number of bytes that each line or block represents.
Instead of using a standard java.util.HashMap, we use a custom
open hashing scheme similar to what we use in ObjectIdSubclassMap.
This permits us to have a very lightweight hash, with very little
memory overhead per cell stored.
As we only need two ints per record in the map (line/block key and
number of bytes), we collapse them into a single long stored in a
long array, making very efficient use of available memory when we
create the index table. Object headers are needed only for the
index structure itself and its backing array, not for each cell.
This offers a massive space savings over using java.util.HashMap.
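The packed-record idea above can be sketched as follows; the class and
field layout here are illustrative assumptions, not JGit's actual code,
with the key in the upper 32 bits of a long and the byte count in the
lower 32:

```java
// Illustrative sketch only: pack a (key, byteCount) pair into one long,
// as described above. The bit layout chosen here is an assumption.
public class PackedRecord {
    // key in the upper 32 bits, byte count in the lower 32 bits
    static long pack(int key, int byteCount) {
        return (((long) key) << 32) | (byteCount & 0xffffffffL);
    }

    static int key(long record) {
        return (int) (record >>> 32);
    }

    static int byteCount(long record) {
        return (int) record;
    }

    public static void main(String[] args) {
        long r = pack(0x1234, 512);
        System.out.println(key(r) == 0x1234 && byteCount(r) == 512); // true
    }
}
```

Storing these longs directly in a long[] keeps the whole index in two
heap objects (the index and its array), regardless of how many records
it holds.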
The score calculation is done by approximating how many bytes are
the same between the two inputs (which, for a delta, would be how
much is copied from the base into the result). The score is derived
by dividing the approximate number of bytes in common by the length
of the larger of the two input files.
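That score rule can be written out as a small helper (a hypothetical
sketch, not JGit's API):

```java
// Illustrative sketch of the similarity score described above: the
// fraction of the larger file's bytes that both files have in common,
// expressed as a percentage.
public class SimilarityScore {
    static int score(long commonBytes, long sizeA, long sizeB) {
        long larger = Math.max(sizeA, sizeB);
        if (larger == 0)
            return 100; // two empty files are trivially identical
        return (int) (100 * commonBytes / larger);
    }

    public static void main(String[] args) {
        // 60 of the larger file's 100 bytes are shared -> score 60
        System.out.println(score(60, 100, 80));
    }
}
```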
Right now the SimilarityIndex table should average about 1/2 full,
which means we waste about 50% of our memory on empty entries
after we are done indexing a file and sort the table's contents.
If memory becomes an issue we could discard the table and copy all
records over to a new array that is properly sized.
Building the index requires O(M + N log N) time, where M is the
size of the input file in bytes, and N is the number of unique
lines/blocks in the file. The N log N time constraint comes
from the sort of the index table that is necessary to perform
linear time matching against another SimilarityIndex created for
a different file.
To actually perform the rename detection, an S x D matrix is created,
placing the sources (aka deletions) along one dimension and the
destinations (aka additions) along the other. A simple O(S x D)
loop examines every cell in this matrix.
A SimilarityIndex is built along the row and reused for each
column compare along that row, avoiding the costly index rebuild
at the row level. A future improvement would be to load a smaller
square matrix into SimilarityIndexes and process everything in that
sub-matrix before discarding the column dimension and moving down
to the next sub-matrix block along that same grid of rows.
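The scan described above can be sketched as follows; the types and
helpers here (the prefix-based stand-in for SimilarityIndex, the
threshold, the Candidate record) are hypothetical illustrations, not
JGit's real API. The key point is that the source's index is obtained
once per row and reused across every column:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative O(S x D) rename scan as described above. A shared-prefix
// percentage stands in for the SimilarityIndex comparison; all names
// are hypothetical.
public class RenameScan {
    static final int THRESHOLD = 60; // minimum score to call it a rename

    record Candidate(int srcRow, int dstCol, int score) {}

    // Stand-in similarity: percentage of the longer string covered
    // by the common prefix.
    static int score(String src, String dst) {
        int common = 0;
        int max = Math.max(src.length(), dst.length());
        while (common < Math.min(src.length(), dst.length())
                && src.charAt(common) == dst.charAt(common))
            common++;
        return max == 0 ? 100 : 100 * common / max;
    }

    static List<Candidate> scan(List<String> sources, List<String> dests) {
        List<Candidate> out = new ArrayList<>();
        for (int row = 0; row < sources.size(); row++) {
            String src = sources.get(row); // "index" fetched once per row
            for (int col = 0; col < dests.size(); col++) {
                int s = score(src, dests.get(col));
                if (s >= THRESHOLD)
                    out.add(new Candidate(row, col, s));
            }
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(scan(List.of("abcdef"),
                List.of("abcdexy", "zzz")).size());
    }
}
```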
An optional ProgressMonitor is permitted to be passed in, allowing
applications to see the progress of the detector as it works through
the matrix cells. This provides some indication of current status
for very long running renames.
The default line/block hash function used by the SimilarityIndex
may not be optimal, and may produce too many collisions. It is
borrowed from RawText's hash, which is used to quickly bail out of
a longer equality test when two lines have different hash codes.
We may need to refine this hash in the future, in order to minimize
the number of collisions we get on common source files.
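For illustration, a line hash in the same spirit can be a simple
multiply-and-add over the line's bytes; this is an assumed example for
demonstration, not RawText's exact function:

```java
// Illustrative multiplicative line hash in the spirit described above.
// Not JGit's actual function. Collisions map distinct lines into one
// bucket, which inflates the approximated common-byte count.
public class LineHash {
    static int hashLine(byte[] raw, int ptr, int end) {
        int hash = 5381;
        for (; ptr < end; ptr++)
            hash = (hash << 5) + hash + (raw[ptr] & 0xff); // hash * 33 + byte
        return hash;
    }

    public static void main(String[] args) {
        byte[] a = "int x = 1;".getBytes();
        byte[] b = "int x = 2;".getBytes();
        System.out.println(hashLine(a, 0, a.length) != hashLine(b, 0, b.length));
    }
}
```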
Based on a handful of test commits in JGit (especially my own
recent rename repository refactoring series), this rename detector
produces output that is very close to C Git. The content similarity
scores are sometimes off by 1%, which is most probably caused by
our SimilarityIndex type using a different hash function than C
Git uses when it computes the delta size between any two objects
in the rename matrix.
Bug: 318504
Change-Id: I11dff969e8a2e4cf252636d857d2113053bdd9dc
Signed-off-by: Shawn O. Pearce <spearce@spearce.org>
- DIRCChecksumMismatch=DIRC checksum mismatch
- DIRCExtensionIsTooLargeAt=DIRC extension {0} is too large at {1} bytes.
- DIRCExtensionNotSupportedByThisVersion=DIRC extension {0} not supported by this version.
- DIRCHasTooManyEntries=DIRC has too many entries.
- DIRCUnrecognizedExtendedFlags=Unrecognized extended flags: {0}
- JRELacksMD5Implementation=JRE lacks MD5 implementation
- URINotSupported=URI not supported: {0}
- URLNotFound={0} not found
- aNewObjectIdIsRequired=A NewObjectId is required.
- abbreviationLengthMustBeNonNegative=Abbreviation length must not be negative.
- abortingRebase=Aborting rebase: resetting to {0}
- abortingRebaseFailed=Could not abort rebase
- advertisementCameBefore=advertisement of {0}^{} came before {1}
- advertisementOfCameBefore=advertisement of {0}^{} came before {1}
- amazonS3ActionFailed={0} of '{1}' failed: {2} {3}
- amazonS3ActionFailedGivingUp={0} of '{1}' failed: Giving up after {2} attempts.
- ambiguousObjectAbbreviation=Object abbreviation {0} is ambiguous
- anExceptionOccurredWhileTryingToAddTheIdOfHEAD=An exception occurred while trying to add the Id of HEAD
- anSSHSessionHasBeenAlreadyCreated=An SSH session has been already created
- applyingCommit=Applying {0}
- atLeastOnePathIsRequired=At least one path is required.
- atLeastOnePatternIsRequired=At least one pattern is required.
- atLeastTwoFiltersNeeded=At least two filters needed.
- authenticationNotSupported=authentication not supported
- badBase64InputCharacterAt=Bad Base64 input character at {0} : {1} (decimal)
- badEntryDelimiter=Bad entry delimiter
- badEntryName=Bad entry name: {0}
- badEscape=Bad escape: {0}
- badGroupHeader=Bad group header
- badObjectType=Bad object type: {0}
- badSectionEntry=Bad section entry: {0}
- base64InputNotProperlyPadded=Base64 input not properly padded.
- baseLengthIncorrect=base length incorrect
- bareRepositoryNoWorkdirAndIndex=Bare Repository has neither a working tree, nor an index
- blobNotFound=Blob not found: {0}
- branchNameInvalid=Branch name {0} is not allowed
- blobNotFoundForPath=Blob not found: {0} for path: {1}
- cachedPacksPreventsIndexCreation=Using cached packs prevents index creation
- cannotBeCombined=Cannot be combined.
- cannotCombineTreeFilterWithRevFilter=Cannot combine TreeFilter {0} with RefFilter {1}.
- cannotCommitOnARepoWithState=Cannot commit on a repo with state: {0}
- cannotCommitWriteTo=Cannot commit write to {0}
- cannotConnectPipes=cannot connect pipes
- cannotConvertScriptToText=Cannot convert script to text
- cannotCreateConfig=cannot create config
- cannotCreateDirectory=Cannot create directory {0}
- cannotCreateHEAD=cannot create HEAD
- cannotDeleteCheckedOutBranch=Branch {0} is checked out and can not be deleted
- cannotDeleteFile=Cannot delete file: {0}
- cannotDeleteStaleTrackingRef2=Cannot delete stale tracking ref {0}: {1}
- cannotDeleteStaleTrackingRef=Cannot delete stale tracking ref {0}
- cannotDetermineProxyFor=Cannot determine proxy for {0}
- cannotDownload=Cannot download {0}
- cannotExecute=cannot execute: {0}
- cannotGet=Cannot get {0}
- cannotListRefs=cannot list refs
- cannotLock=Cannot lock {0}
- cannotLockFile=Cannot lock file {0}
- cannotLockPackIn=Cannot lock pack in {0}
- cannotMatchOnEmptyString=Cannot match on empty string.
- cannotMoveIndexTo=Cannot move index to {0}
- cannotMovePackTo=Cannot move pack to {0}
- cannotOpenService=cannot open {0}
- cannotParseGitURIish=Cannot parse Git URI-ish
- cannotPullOnARepoWithState=Cannot pull into a repository with state: {0}
- cannotRead=Cannot read {0}
- cannotReadBlob=Cannot read blob {0}
- cannotReadCommit=Cannot read commit {0}
- cannotReadFile=Cannot read file {0}
- cannotReadHEAD=cannot read HEAD: {0} {1}
- cannotReadObject=Cannot read object
- cannotReadTree=Cannot read tree {0}
- cannotRebaseWithoutCurrentHead=Can not rebase without a current HEAD
- cannotResolveLocalTrackingRefForUpdating=Cannot resolve local tracking ref {0} for updating.
- cannotStoreObjects=cannot store objects
- cannotUnloadAModifiedTree=Cannot unload a modified tree.
- cannotWorkWithOtherStagesThanZeroRightNow=Cannot work with other stages than zero right now. Won't write corrupt index.
- canOnlyCherryPickCommitsWithOneParent=Can only cherry-pick commits which have exactly one parent
- canOnlyRevertCommitsWithOneParent=Can only revert commits which have exactly one parent
- cantFindObjectInReversePackIndexForTheSpecifiedOffset=Can't find object in (reverse) pack index for the specified offset {0}
- cantPassMeATree=Can't pass me a tree!
- channelMustBeInRange0_255=channel {0} must be in range [0, 255]
- characterClassIsNotSupported=The character class {0} is not supported.
- checkoutConflictWithFile=Checkout conflict with file: {0}
- checkoutConflictWithFiles=Checkout conflict with files: {0}
- checkoutUnexpectedResult=Checkout returned unexpected result {0}
- classCastNotA=Not a {0}
- collisionOn=Collision on {0}
- commandWasCalledInTheWrongState=Command {0} was called in the wrong state
- commitAlreadyExists=exists {0}
- commitMessageNotSpecified=commit message not specified
- commitOnRepoWithoutHEADCurrentlyNotSupported=Commit on repo without HEAD currently not supported
- compressingObjects=Compressing objects
- connectionFailed=connection failed
- connectionTimeOut=Connection time out: {0}
- contextMustBeNonNegative=context must be >= 0
- corruptObjectBadStream=bad stream
- corruptObjectBadStreamCorruptHeader=bad stream, corrupt header
- corruptObjectGarbageAfterSize=garbage after size
- corruptObjectIncorrectLength=incorrect length
- corruptObjectInvalidEntryMode=invalid entry mode
- corruptObjectInvalidMode2=invalid mode {0}
- corruptObjectInvalidMode3=invalid mode {0} for {1} '{2}' in {3}.
- corruptObjectInvalidMode=invalid mode
- corruptObjectInvalidType2=invalid type {0}
- corruptObjectInvalidType=invalid type
- corruptObjectMalformedHeader=malformed header: {0}
- corruptObjectNegativeSize=negative size
- corruptObjectNoAuthor=no author
- corruptObjectNoCommitter=no committer
- corruptObjectNoHeader=no header
- corruptObjectNoObject=no object
- corruptObjectNoTagName=no tag name
- corruptObjectNoTaggerBadHeader=no tagger/bad header
- corruptObjectNoTaggerHeader=no tagger header
- corruptObjectNoType=no type
- corruptObjectNotree=no tree
- corruptObjectPackfileChecksumIncorrect=Packfile checksum incorrect.
- corruptionDetectedReReadingAt=Corruption detected re-reading at {0}
- couldNotCheckOutBecauseOfConflicts=Could not check out because of conflicts
- couldNotDeleteLockFileShouldNotHappen=Could not delete lock file. Should not happen
- couldNotDeleteTemporaryIndexFileShouldNotHappen=Could not delete temporary index file. Should not happen
- couldNotGetAdvertisedRef=Could not get advertised Ref for branch {0}
- couldNotLockHEAD=Could not lock HEAD
- couldNotReadIndexInOneGo=Could not read index in one go, only {0} out of {1} read
- couldNotReadObjectWhileParsingCommit=Could not read an object while parsing commit {0}
- couldNotRenameDeleteOldIndex=Could not rename delete old index
- couldNotRenameTemporaryFile=Could not rename temporary file {0} to new location {1}
- couldNotRenameTemporaryIndexFileToIndex=Could not rename temporary index file to index
- couldNotURLEncodeToUTF8=Could not URL encode to UTF-8
- couldNotWriteFile=Could not write file {0}
- countingObjects=Counting objects
- createBranchFailedUnknownReason=Create branch failed for unknown reason
- createBranchUnexpectedResult=Create branch returned unexpected result {0}
- createNewFileFailed=Could not create new file {0}
- credentialPassword=Password
- credentialUsername=Username
- daemonAlreadyRunning=Daemon already running
- deleteBranchUnexpectedResult=Delete branch returned unexpected result {0}
- deleteFileFailed=Could not delete file {0}
- deletingNotSupported=Deleting {0} not supported.
- destinationIsNotAWildcard=Destination is not a wildcard.
- detachedHeadDetected=HEAD is detached
- dirCacheDoesNotHaveABackingFile=DirCache does not have a backing file
- dirCacheFileIsNotLocked=DirCache {0} not locked
- dirCacheIsNotLocked=DirCache is not locked
- dirtyFilesExist=Dirty files exist. Refusing to merge
- doesNotHandleMode=Does not handle mode {0} ({1})
- downloadCancelled=Download cancelled
- downloadCancelledDuringIndexing=Download cancelled during indexing
- duplicateAdvertisementsOf=duplicate advertisements of {0}
- duplicateRef=Duplicate ref: {0}
- duplicateRemoteRefUpdateIsIllegal=Duplicate remote ref update is illegal. Affected remote name: {0}
- duplicateStagesNotAllowed=Duplicate stages not allowed
- eitherGitDirOrWorkTreeRequired=One of setGitDir or setWorkTree must be called.
- emptyCommit=No changes
- emptyPathNotPermitted=Empty path not permitted.
- encryptionError=Encryption error: {0}
- endOfFileInEscape=End of file in escape
- entryNotFoundByPath=Entry not found by path: {0}
- enumValueNotSupported2=Invalid value: {0}.{1}={2}
- enumValueNotSupported3=Invalid value: {0}.{1}.{2}={3}
- enumValuesNotAvailable=Enumerated values of type {0} not available
- errorDecodingFromFile=Error decoding from file {0}
- errorEncodingFromFile=Error encoding from file {0}
- errorInBase64CodeReadingStream=Error in Base64 code reading stream.
- errorInPackedRefs=error in packed-refs
- errorInvalidProtocolWantedOldNewRef=error: invalid protocol: wanted 'old new ref'
- errorListing=Error listing {0}
- errorOccurredDuringUnpackingOnTheRemoteEnd=error occurred during unpacking on the remote end: {0}
- errorReadingInfoRefs=error reading info/refs
- exceptionCaughtDuringExecutionOfAddCommand=Exception caught during execution of add command
- exceptionCaughtDuringExecutionOfCherryPickCommand=Exception caught during execution of cherry-pick command. {0}
- exceptionCaughtDuringExecutionOfCommitCommand=Exception caught during execution of commit command
- exceptionCaughtDuringExecutionOfFetchCommand=Exception caught during execution of fetch command
- exceptionCaughtDuringExecutionOfMergeCommand=Exception caught during execution of merge command. {0}
- exceptionCaughtDuringExecutionOfPushCommand=Exception caught during execution of push command
- exceptionCaughtDuringExecutionOfPullCommand=Exception caught during execution of pull command
- exceptionCaughtDuringExecutionOfResetCommand=Exception caught during execution of reset command. {0}
- exceptionCaughtDuringExecutionOfRevertCommand=Exception caught during execution of revert command. {0}
- exceptionCaughtDuringExecutionOfRmCommand=Exception caught during execution of rm command
- exceptionCaughtDuringExecutionOfTagCommand=Exception caught during execution of tag command
- exceptionOccuredDuringAddingOfOptionToALogCommand=Exception occurred during adding of {0} as option to a Log command
- exceptionOccuredDuringReadingOfGIT_DIR=Exception occurred during reading of $GIT_DIR/{0}. {1}
- expectedACKNAKFoundEOF=Expected ACK/NAK, found EOF
- expectedACKNAKGot=Expected ACK/NAK, got: {0}
- expectedBooleanStringValue=Expected boolean string value
- expectedCharacterEncodingGuesses=Expected {0} character encoding guesses
- expectedEOFReceived=expected EOF; received '{0}' instead
- expectedGot=expected '{0}', got '{1}'
- expectedPktLineWithService=expected pkt-line with '# service=-', got '{0}'
- expectedReceivedContentType=expected Content-Type {0}; received Content-Type {1}
- expectedReportForRefNotReceived={0}: expected report for ref {1} not received
- failedUpdatingRefs=failed updating refs
- failureDueToOneOfTheFollowing=Failure due to one of the following:
- failureUpdatingFETCH_HEAD=Failure updating FETCH_HEAD: {0}
- failureUpdatingTrackingRef=Failure updating tracking ref {0}: {1}
- fileCannotBeDeleted=File cannot be deleted: {0}
- fileIsTooBigForThisConvenienceMethod=File is too big for this convenience method ({0} bytes).
- fileIsTooLarge=File is too large: {0}
- fileModeNotSetForPath=FileMode not set for path {0}
- flagIsDisposed={0} is disposed.
- flagNotFromThis={0} not from this.
- flagsAlreadyCreated={0} flags already created.
- funnyRefname=funny refname
- hugeIndexesAreNotSupportedByJgitYet=Huge indexes are not supported by jgit, yet
- hunkBelongsToAnotherFile=Hunk belongs to another file
- hunkDisconnectedFromFile=Hunk disconnected from file
- hunkHeaderDoesNotMatchBodyLineCountOf=Hunk header {0} does not match body line count of {1}
- illegalArgumentNotA=Not {0}
- illegalCombinationOfArguments=The combination of arguments {0} and {1} is not allowed
- illegalStateExists=exists {0}
- improperlyPaddedBase64Input=Improperly padded Base64 input.
- inMemoryBufferLimitExceeded=In-memory buffer limit exceeded
- incorrectHashFor=Incorrect hash for {0}; computed {1} as a {2} from {3} bytes.
- incorrectOBJECT_ID_LENGTH=Incorrect OBJECT_ID_LENGTH.
- incorrectObjectType_COMMITnorTREEnorBLOBnorTAG=COMMIT nor TREE nor BLOB nor TAG
- indexFileIsInUse=Index file is in use
- indexFileIsTooLargeForJgit=Index file is too large for jgit
- indexSignatureIsInvalid=Index signature is invalid: {0}
- indexWriteException=Modified index could not be written
- integerValueOutOfRange=Integer value {0}.{1} out of range
- internalRevisionError=internal revision error
- interruptedWriting=Interrupted writing {0}
- invalidAdvertisementOf=invalid advertisement of {0}
- invalidAncestryLength=Invalid ancestry length
- invalidBooleanValue=Invalid boolean value: {0}.{1}={2}
- invalidChannel=Invalid channel {0}
- invalidCharacterInBase64Data=Invalid character in Base64 data.
- invalidCommitParentNumber=Invalid commit parent number
- invalidEncryption=Invalid encryption
- invalidGitType=invalid git type: {0}
- invalidId=Invalid id {0}
- invalidIdLength=Invalid id length {0}; should be {1}
- invalidIntegerValue=Invalid integer value: {0}.{1}={2}
- invalidKey=Invalid key: {0}
- invalidLineInConfigFile=Invalid line in config file
- invalidModeFor=Invalid mode {0} for {1} {2} in {3}.
- invalidModeForPath=Invalid mode {0} for path {1}
- invalidObject=Invalid {0} {1}:{2}
- invalidOldIdSent=invalid old id sent
- invalidPacketLineHeader=Invalid packet line header: {0}
- invalidPath=Invalid path: {0}
- invalidRefName=Invalid ref name: {0}
- invalidRemote=Invalid remote: {0}
- invalidStageForPath=Invalid stage {0} for path {1}
- invalidTagOption=Invalid tag option: {0}
- invalidTimeout=Invalid timeout: {0}
- invalidURL=Invalid URL {0}
- invalidWildcards=Invalid wildcards {0}
- invalidWindowSize=Invalid window size
- isAStaticFlagAndHasNorevWalkInstance={0} is a static flag and has no RevWalk instance
- kNotInRange=k {0} not in {1} - {2}
- largeObjectException={0} exceeds size limit
- largeObjectOutOfMemory=Out of memory loading {0}
- largeObjectExceedsByteArray=Object {0} exceeds 2 GiB byte array limit
- largeObjectExceedsLimit=Object {0} exceeds {1} limit, actual size is {2}
- lengthExceedsMaximumArraySize=Length exceeds maximum array size
- listingAlternates=Listing alternates
- localObjectsIncomplete=Local objects incomplete.
- localRefIsMissingObjects=Local ref {0} is missing object(s).
- lockCountMustBeGreaterOrEqual1=lockCount must be >= 1
- lockError=lock error: {0}
- lockOnNotClosed=Lock on {0} not closed.
- lockOnNotHeld=Lock on {0} not held.
- malformedpersonIdentString=Malformed PersonIdent string (no < was found): {0}
- mergeConflictOnNotes=Merge conflict on note {0}. base = {1}, ours = {2}, theirs = {3}
- mergeConflictOnNonNoteEntries=Merge conflict on non-note entries: base = {0}, ours = {1}, theirs = {2}
- mergeStrategyAlreadyExistsAsDefault=Merge strategy "{0}" already exists as a default strategy
- mergeStrategyDoesNotSupportHeads=merge strategy {0} does not support {1} heads to be merged into HEAD
- mergeUsingStrategyResultedInDescription=Merge of revisions {0} with base {1} using strategy {2} resulted in: {3}. {4}
- missingAccesskey=Missing accesskey.
- missingConfigurationForKey=No value for key {0} found in configuration
- missingDeltaBase=delta base
- missingForwardImageInGITBinaryPatch=Missing forward-image in GIT binary patch
- missingObject=Missing {0} {1}
- missingPrerequisiteCommits=missing prerequisite commits:
- missingRequiredParameter=Parameter "{0}" is missing
- missingSecretkey=Missing secretkey.
- mixedStagesNotAllowed=Mixed stages not allowed
- mkDirFailed=Creating directory {0} failed
- mkDirsFailed=Creating directories for {0} failed
- multipleMergeBasesFor=Multiple merge bases for:\n {0}\n {1} found:\n {2}\n {3}
- need2Arguments=Need 2 arguments
- needPackOut=need packOut
- needsAtLeastOneEntry=Needs at least one entry
- needsWorkdir=Needs workdir
- newlineInQuotesNotAllowed=Newline in quotes not allowed
- noApplyInDelete=No apply in delete
- noClosingBracket=No closing {0} found for {1} at index {2}.
- noHEADExistsAndNoExplicitStartingRevisionWasSpecified=No HEAD exists and no explicit starting revision was specified
- noHMACsupport=No {0} support: {1}
- noMergeHeadSpecified=No merge head specified
- noSuchRef=no such ref
- noXMLParserAvailable=No XML parser available.
- notABoolean=Not a boolean: {0}
- notABundle=not a bundle
- notADIRCFile=Not a DIRC file.
- notAGitDirectory=not a git directory
- notAPACKFile=Not a PACK file.
- notARef=Not a ref: {0}: {1}
- notASCIIString=Not ASCII string: {0}
- notAuthorized=not authorized
- notAValidPack=Not a valid pack {0}
- notFound=not found.
- notValid={0} not valid
- nothingToFetch=Nothing to fetch.
- nothingToPush=Nothing to push.
- notMergedExceptionMessage=Branch was not deleted as it has not been merged yet; use the force option to delete it anyway
- objectAtHasBadZlibStream=Object at {0} in {1} has bad zlib stream
- objectAtPathDoesNotHaveId=Object at path "{0}" does not have an id assigned. All object ids must be assigned prior to writing a tree.
- objectIsCorrupt=Object {0} is corrupt: {1}
- objectIsNotA=Object {0} is not a {1}.
- objectNotFoundIn=Object {0} not found in {1}.
- obtainingCommitsForCherryPick=Obtaining commits that need to be cherry-picked
- offsetWrittenDeltaBaseForObjectNotFoundInAPack=Offset-written delta base for object not found in a pack
- onlyAlreadyUpToDateAndFastForwardMergesAreAvailable=only already-up-to-date and fast forward merges are available
- onlyOneFetchSupported=Only one fetch supported
- onlyOneOperationCallPerConnectionIsSupported=Only one operation call per connection is supported.
- openFilesMustBeAtLeast1=Open files must be >= 1
- openingConnection=Opening connection
- operationCanceled=Operation {0} was canceled
- outputHasAlreadyBeenStarted=Output has already been started.
- packChecksumMismatch=Pack checksum mismatch
- packCorruptedWhileWritingToFilesystem=Pack corrupted while writing to filesystem
- packDoesNotMatchIndex=Pack {0} does not match index
- packFileInvalid=Pack file invalid: {0}
- packHasUnresolvedDeltas=pack has unresolved deltas
- packObjectCountMismatch=Pack object count mismatch: pack {0} index {1}: {2}
- packTooLargeForIndexVersion1=Pack too large for index version 1
- packetSizeMustBeAtLeast=packet size {0} must be >= {1}
- packetSizeMustBeAtMost=packet size {0} must be <= {1}
- packfileCorruptionDetected=Packfile corruption detected: {0}
- packfileIsTruncated=Packfile is truncated.
- packingCancelledDuringObjectsWriting=Packing cancelled during objects writing
- packWriterStatistics=Total {0,number,#0} (delta {1,number,#0}), reused {2,number,#0} (delta {3,number,#0})
- pathIsNotInWorkingDir=Path is not in working dir
- peeledLineBeforeRef=Peeled line before ref.
- peerDidNotSupplyACompleteObjectGraph=peer did not supply a complete object graph
- prefixRemote=remote:
- problemWithResolvingPushRefSpecsLocally=Problem with resolving push ref specs locally: {0}
- progressMonUploading=Uploading {0}
- propertyIsAlreadyNonNull=Property is already non-null
- pullTaskName=Pull
- pushCancelled=push cancelled
- pushIsNotSupportedForBundleTransport=Push is not supported for bundle transport
- pushNotPermitted=push not permitted
- rawLogMessageDoesNotParseAsLogEntry=Raw log message does not parse as log entry
- readTimedOut=Read timed out
- readingObjectsFromLocalRepositoryFailed=reading objects from local repository failed: {0}
- receivingObjects=Receiving objects
- refAlreadExists=Ref {0} already exists
- refNotResolved=Ref {0} cannot be resolved
- refUpdateReturnCodeWas=RefUpdate return code was: {0}
- reflogsNotYetSupportedByRevisionParser=reflogs not yet supported by revision parser
- remoteConfigHasNoURIAssociated=Remote config "{0}" has no URIs associated
- remoteDoesNotHaveSpec=Remote does not have {0} available for fetch.
- remoteDoesNotSupportSmartHTTPPush=remote does not support smart HTTP push
- remoteHungUpUnexpectedly=remote hung up unexpectedly
- remoteNameCantBeNull=Remote name can't be null.
- renameBranchFailedBecauseTag=Cannot rename as ref {0} is a tag
- renameBranchFailedUnknownReason=Rename failed with unknown reason
- renameBranchUnexpectedResult=Unexpected rename result {0}
- renamesAlreadyFound=Renames have already been found.
- renamesBreakingModifies=Breaking apart modified file pairs
- renamesFindingByContent=Finding renames by content similarity
- renamesFindingExact=Finding exact renames
- renamesRejoiningModifies=Rejoining modified file pairs
- repositoryAlreadyExists=Repository already exists: {0}
- repositoryConfigFileInvalid=Repository config file {0} invalid {1}
- repositoryIsRequired=Repository is required.
- repositoryNotFound=repository not found: {0}
- repositoryState_applyMailbox=Apply mailbox
- repositoryState_bisecting=Bisecting
- repositoryState_conflicts=Conflicts
- repositoryState_merged=Merged
- repositoryState_normal=Normal
- repositoryState_rebase=Rebase
- repositoryState_rebaseInteractive=Rebase interactive
- repositoryState_rebaseOrApplyMailbox=Rebase/Apply mailbox
- repositoryState_rebaseWithMerge=Rebase w/merge
- requiredHashFunctionNotAvailable=Required hash function {0} not available.
- resolvingDeltas=Resolving deltas
- resettingHead=Resetting head to {0}
- resultLengthIncorrect=result length incorrect
- rewinding=Rewinding to commit {0}
- searchForReuse=Finding sources
- searchForSizes=Getting sizes
- sequenceTooLargeForDiffAlgorithm=Sequence too large for difference algorithm.
- serviceNotEnabledNoName=Service not enabled
- serviceNotPermitted={0} not permitted
- serviceNotPermittedNoName=Service not permitted
- shortCompressedStreamAt=Short compressed stream at {0}
- shortReadOfBlock=Short read of block.
- shortReadOfOptionalDIRCExtensionExpectedAnotherBytes=Short read of optional DIRC extension {0}; expected another {1} bytes within the section.
- shortSkipOfBlock=Short skip of block.
- signingNotSupportedOnTag=Signing isn't supported on tag operations yet.
- similarityScoreMustBeWithinBounds=Similarity score must be between 0 and 100.
- sizeExceeds2GB=Path {0} size {1} exceeds 2 GiB limit.
- smartHTTPPushDisabled=smart HTTP push disabled
- sourceDestinationMustMatch=Source/Destination must match.
- sourceIsNotAWildcard=Source is not a wildcard.
- sourceRefDoesntResolveToAnyObject=Source ref {0} doesn't resolve to any object.
- sourceRefNotSpecifiedForRefspec=Source ref not specified for refspec: {0}
- staleRevFlagsOn=Stale RevFlags on {0}
- startingReadStageWithoutWrittenRequestDataPendingIsNotSupported=Starting read stage without written request data pending is not supported
- statelessRPCRequiresOptionToBeEnabled=stateless RPC requires {0} to be enabled
- submodulesNotSupported=Submodules are not supported
- symlinkCannotBeWrittenAsTheLinkTarget=Symlink "{0}" cannot be written as the link target cannot be read from within Java.
- systemConfigFileInvalid=System-wide config file {0} is invalid {1}
- tagNameInvalid=tag name {0} is invalid
- tagOnRepoWithoutHEADCurrentlyNotSupported=Tag on repository without HEAD currently not supported
- tSizeMustBeGreaterOrEqual1=tSize must be >= 1
- theFactoryMustNotBeNull=The factory must not be null
- timerAlreadyTerminated=Timer already terminated
- topologicalSortRequired=Topological sort required.
- transportExceptionBadRef=Empty ref: {0}: {1}
- transportExceptionEmptyRef=Empty ref: {0}
- transportExceptionInvalid=Invalid {0} {1}:{2}
- transportExceptionMissingAssumed=Missing assumed {0}
- transportExceptionReadRef=read {0}
- treeEntryAlreadyExists=Tree entry "{0}" already exists.
- treeIteratorDoesNotSupportRemove=TreeIterator does not support remove()
- truncatedHunkLinesMissingForAncestor=Truncated hunk, at least {0} lines missing for ancestor {1}
- truncatedHunkNewLinesMissing=Truncated hunk, at least {0} new lines are missing
- truncatedHunkOldLinesMissing=Truncated hunk, at least {0} old lines are missing
- unableToCheckConnectivity=Unable to check connectivity.
- unableToStore=Unable to store {0}.
- unableToWrite=Unable to write {0}
- unencodeableFile=Unencodeable file: {0}
- unexpectedCompareResult=Unexpected metadata comparison result: {0}
- unexpectedEndOfConfigFile=Unexpected end of config file
- unexpectedHunkTrailer=Unexpected hunk trailer
- unexpectedOddResult=odd: {0} + {1} - {2}
- unexpectedRefReport={0}: unexpected ref report: {1}
- unexpectedReportLine2={0} unexpected report line: {1}
- unexpectedReportLine=unexpected report line: {0}
- unknownDIRCVersion=Unknown DIRC version {0}
- unknownHost=unknown host
- unknownIndexVersionOrCorruptIndex=Unknown index version (or corrupt index): {0}
- unknownObject=unknown object
- unknownObjectType=Unknown object type {0}.
- unknownRepositoryFormat2=Unknown repository format "{0}"; expected "0".
- unknownRepositoryFormat=Unknown repository format
- unknownZlibError=Unknown zlib error.
- unpackException=Exception while parsing pack stream
- unmergedPath=Unmerged path: {0}
- unmergedPaths=Repository contains unmerged paths
- unreadablePackIndex=Unreadable pack index: {0}
- unrecognizedRef=Unrecognized ref: {0}
- unsupportedCommand0=unsupported command 0
- unsupportedEncryptionAlgorithm=Unsupported encryption algorithm: {0}
- unsupportedEncryptionVersion=Unsupported encryption version: {0}
- unsupportedOperationNotAddAtEnd=Not add-at-end: {0}
- unsupportedPackIndexVersion=Unsupported pack index version {0}
- unsupportedPackVersion=Unsupported pack version {0}.
- updatingRefFailed=Updating the ref {0} to {1} failed. ReturnCode from RefUpdate.update() was {2}
- uriNotFound={0} not found
- userConfigFileInvalid=User config file {0} invalid {1}
- walkFailure=Walk failure.
- windowSizeMustBeLesserThanLimit=Window size must be < limit
- windowSizeMustBePowerOf2=Window size must be power of 2
- writeTimedOut=Write timed out
- writerAlreadyInitialized=Writer already initialized
- writingNotPermitted=Writing not permitted
- writingNotSupported=Writing {0} not supported.
- writingObjects=Writing objects
- wrongDecompressedLength=wrong decompressed length
- wrongRepositoryState=Wrong repository state: {0}