
JGitText.properties (25 KB)

PackWriter: Support reuse of entire packs

The most expensive part of packing a repository for transport to another system is enumerating all of the objects in the repository. Once this gets to the size of the linux-2.6 repository (1.8 million objects), enumeration can take several CPU minutes and costs a lot of temporary working set memory.

Teach PackWriter to efficiently reuse an existing "cached pack" by answering a clone request with a thin pack followed by a larger cached pack appended to the end. This requires the repository owner to first construct the cached pack by hand, and record the tip commits inside of $GIT_DIR/objects/info/cached-packs:

  cd $GIT_DIR
  root=$(git rev-parse master)
  tmp=objects/.tmp-$$
  names=$(echo $root | git pack-objects --keep-true-parents --revs $tmp)
  for n in $names; do
    chmod a-w $tmp-$n.pack $tmp-$n.idx
    touch objects/pack/pack-$n.keep
    mv $tmp-$n.pack objects/pack/pack-$n.pack
    mv $tmp-$n.idx objects/pack/pack-$n.idx
  done
  (echo "+ $root";
   for n in $names; do echo "P $n"; done;
   echo) >>objects/info/cached-packs
  git repack -a -d

When a clone request needs to include $root, the corresponding cached pack will be copied as-is, rather than enumerating all of the objects that are reachable from $root.

For a linux-2.6 kernel repository that should be about 376 MiB, the above process creates two packs of 368 MiB and 38 MiB [1]. This is a local disk usage increase of ~26 MiB, due to reduced delta compression between the large cached pack and the smaller recent activity pack. The overhead is similar to 1 full copy of the compressed project sources.

With this cached pack in hand, JGit daemon completes a clone request in 1m17s less time, but a slightly larger data transfer (+2.39 MiB):

  Before:
    remote: Counting objects: 1861830, done
    remote: Finding sources: 100% (1861830/1861830)
    remote: Getting sizes: 100% (88243/88243)
    remote: Compressing objects: 100% (88184/88184)
    Receiving objects: 100% (1861830/1861830), 376.01 MiB | 19.01 MiB/s, done.
    remote: Total 1861830 (delta 4706), reused 1851053 (delta 1553844)
    Resolving deltas: 100% (1564621/1564621), done.

    real 3m19.005s

  After:
    remote: Counting objects: 1601, done
    remote: Counting objects: 1828460, done
    remote: Finding sources: 100% (50475/50475)
    remote: Getting sizes: 100% (18843/18843)
    remote: Compressing objects: 100% (7585/7585)
    remote: Total 1861830 (delta 2407), reused 1856197 (delta 37510)
    Receiving objects: 100% (1861830/1861830), 378.40 MiB | 31.31 MiB/s, done.
    Resolving deltas: 100% (1559477/1559477), done.

    real 2m2.938s

Repository owners can periodically refresh their cached packs by repacking their repository, folding all newer objects into a larger cached pack. Since repacking is already considered to be a normal Git maintenance activity, this isn't a very big burden.

[1] In this test $root was set back about two weeks.

Change-Id: Ib87131d5c4b5e8c5cacb0f4fe16ff4ece554734b
Signed-off-by: Shawn O. Pearce <spearce@spearce.org>
13 years ago
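The cached-packs file written by the shell loop above is a simple line-oriented format: each record lists its tip commits on "+ " lines, the packs that contain them on "P " lines, and ends with a blank line. Below is a minimal sketch of a reader for that format, assuming exactly the layout produced by the script; the class and method names are illustrative and are not JGit's actual implementation.

  import java.io.BufferedReader;
  import java.io.IOException;
  import java.io.Reader;
  import java.util.ArrayList;
  import java.util.List;

  /** Illustrative parser for the cached-packs format shown above. */
  public class CachedPacksParser {
      /** One record: the tip commits plus the pack names that cover them. */
      public static class Entry {
          public final List<String> tips = new ArrayList<>();
          public final List<String> packNames = new ArrayList<>();
      }

      /**
       * Reads $GIT_DIR/objects/info/cached-packs, where each record is a
       * series of "+ <tip commit>" lines followed by "P <pack name>" lines,
       * terminated by a blank line (as written by the shell loop above).
       */
      public static List<Entry> parse(Reader src) throws IOException {
          List<Entry> entries = new ArrayList<>();
          Entry cur = null;
          BufferedReader in = new BufferedReader(src);
          String line;
          while ((line = in.readLine()) != null) {
              if (line.isEmpty()) {          // blank line ends the record
                  cur = null;
                  continue;
              }
              if (cur == null) {
                  cur = new Entry();
                  entries.add(cur);
              }
              if (line.startsWith("+ "))
                  cur.tips.add(line.substring(2).trim());
              else if (line.startsWith("P "))
                  cur.packNames.add(line.substring(2).trim());
              // lines with unknown prefixes are ignored
          }
          return entries;
      }
  }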
Merging Git notes

Merging Git notes branches has several differences from merging "normal" branches.

Although Git notes are initially stored as one flat tree, the tree may fan out when the number of notes becomes too large for efficient access. In this case the first two hex digits of the note name are used as a subdirectory name and the remaining 38 hex digits as the file name under that directory. Similarly, when the number of notes decreases, a fanout tree may collapse back into a flat tree. The Git notes merge algorithm must take into account possibly different tree structures in different note branches and must properly match them against each other.

Any conflict on a Git note is, by default, resolved by concatenating the two conflicting versions of the note. A delete-edit conflict is, by default, resolved by keeping the edit version. The note merge logic is pluggable, and the caller may provide a custom note merger that implements a different merging strategy.

Additionally, it is possible to have non-note entries inside a notes tree. The merge algorithm must also take this fact into account and will try to merge such non-note entries. However, in case of any merge conflict on such entries the merge operation will fail, since the Git notes merge algorithm currently does not attempt a content merge of non-note entries.

Thanks to Shawn Pearce for patiently answering my questions related to this topic, giving hints and providing code snippets.

Change-Id: I3b2335c76c766fd7ea25752e54087f9b19d69c88
Signed-off-by: Sasa Zivkov <sasa.zivkov@sap.com>
Signed-off-by: Matthias Sohn <matthias.sohn@sap.com>
13 years ago
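The flat-versus-fanout note naming described in the commit above can be illustrated with a few lines of path manipulation. This is a hypothetical sketch, not JGit's note storage code; it only shows how a 40-hex note name maps to the 2/38 fanout layout and back.

  /**
   * Illustrative helpers for the note-name layouts described above: a flat
   * tree stores a note under its full 40-hex object name, while a 2/38
   * fanout stores it as "<first two hex digits>/<remaining 38 hex digits>".
   */
  public class NotePaths {
      /** e.g. "ab12..." -> "ab/12..." */
      public static String toFanout(String hexObjectName) {
          if (hexObjectName.length() != 40)
              throw new IllegalArgumentException("expected 40 hex digits");
          return hexObjectName.substring(0, 2) + "/" + hexObjectName.substring(2);
      }

      /** e.g. "ab/12..." -> "ab12...", leaving flat names untouched. */
      public static String toFlat(String notePath) {
          return notePath.replace("/", "");
      }

      public static void main(String[] args) {
          String name = "0123456789abcdef0123456789abcdef01234567";
          System.out.println(toFanout(name));                    // 01/23456789abcdef...
          System.out.println(toFlat(toFanout(name)).equals(name)); // true
      }
  }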
Implement similarity based rename detection

Content similarity based rename detection is performed only after a linear time detection is performed using exact content match on the ObjectIds. Any names which were paired up during that exact match phase are excluded from the inexact similarity based rename, which reduces the space that must be considered.

During rename detection two entries cannot be marked as a rename if they are different types of files. This prevents a symlink from being renamed to a regular file, even if their blob content appears to be similar, or is identical.

Efficiently comparing two files is performed by building up two hash indexes and hashing lines or short blocks from each file, counting the number of bytes that each line or block represents.

Instead of using a standard java.util.HashMap, we use a custom open hashing scheme similar to what we use in ObjectIdSubclassMap. This permits us to have a very lightweight hash, with very little memory overhead per cell stored. As we only need two ints per record in the map (line/block key and number of bytes), we collapse them into a single long inside of a long array, making very efficient use of available memory when we create the index table. We only need object headers for the index structure itself and the index table, but not per cell. This offers a massive space savings over using java.util.HashMap.

The score calculation is done by approximating how many bytes are the same between the two inputs (which for a delta would be how much is copied from the base into the result). The score is derived by dividing the approximate number of bytes in common by the length of the larger of the two input files.

Right now the SimilarityIndex table should average about 1/2 full, which means we waste about 50% of our memory on empty entries after we are done indexing a file and sort the table's contents. If memory becomes an issue we could discard the table and copy all records over to a new array that is properly sized.

Building the index requires O(M + N log N) time, where M is the size of the input file in bytes, and N is the number of unique lines/blocks in the file. The N log N time constraint comes from the sort of the index table that is necessary to perform linear time matching against another SimilarityIndex created for a different file.

To actually perform the rename detection, an S x D matrix is created, placing the sources (aka deletions) along one dimension and the destinations (aka additions) along the other. A simple O(S x D) loop examines every cell in this matrix. A SimilarityIndex is built along the row and reused for each column compare along that row, avoiding the costly index rebuild at the row level. A future improvement would be to load a smaller square matrix into SimilarityIndexes and process everything in that sub-matrix before discarding the column dimension and moving down to the next sub-matrix block along that same grid of rows.

An optional ProgressMonitor is permitted to be passed in, allowing applications to see the progress of the detector as it works through the matrix cells. This provides some indication of current status for very long running renames.

The default line/block hash function used by the SimilarityIndex may not be optimal, and may produce too many collisions. It is borrowed from RawText's hash, which is used to quickly skip out of a longer equality test if two lines have different hash codes. We may need to refine this hash in the future, in order to minimize the number of collisions we get on common source files.

Based on a handful of test commits in JGit (especially my own recent rename repository refactoring series), this rename detector produces output that is very close to C Git. The content similarity scores are sometimes off by 1%, which is most probably caused by our SimilarityIndex type using a different hash function than C Git uses when it computes the delta size between any two objects in the rename matrix.

Bug: 318504
Change-Id: I11dff969e8a2e4cf252636d857d2113053bdd9dc
Signed-off-by: Shawn O. Pearce <spearce@spearce.org>
14 years ago
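Two details of the commit above lend themselves to a short illustration: collapsing the (block key, byte count) pair into one long of a long[] table, and turning the approximate number of bytes in common into a 0-100 score by dividing by the larger file size. The sketch below mirrors only that description; the names are illustrative and do not reflect SimilarityIndex's real API.

  /**
   * Sketch of packing two ints into one long table cell, plus the
   * similarity score formula described in the commit message above.
   */
  public class SimilarityMath {
      /** Upper 32 bits hold the block key, lower 32 bits the byte count. */
      static long pack(int blockKey, int byteCount) {
          return ((long) blockKey << 32) | (byteCount & 0xFFFFFFFFL);
      }

      static int keyOf(long cell) {
          return (int) (cell >>> 32);
      }

      static int countOf(long cell) {
          return (int) cell;
      }

      /** score = 100 * commonBytes / max(sizeA, sizeB). */
      static int score(long commonBytes, long sizeA, long sizeB) {
          long max = Math.max(sizeA, sizeB);
          return max == 0 ? 100 : (int) (commonBytes * 100 / max);
      }

      public static void main(String[] args) {
          long cell = pack(0x1234abcd, 4096);
          System.out.println(keyOf(cell) == 0x1234abcd);  // true
          System.out.println(countOf(cell) == 4096);      // true
          System.out.println(score(600, 1000, 800));      // 60
      }
  }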
Increase core.streamFileThreshold default to 50 MiB

Projects like org.eclipse.mdt contain large XML files about 6 MiB in size. So does the Android project platform/frameworks/base. Doing a clone of either project with JGit takes forever to check out the files into the working directory, because delta decompression tends to be very expensive as we need to constantly reposition the base stream for each copy instruction. This can be made worse by a very bad ordering of offsets, possibly due to an XML editor that doesn't preserve the order of elements in the file very well.

Increasing the threshold to the same limit PackWriter uses when doing delta compression (50 MiB) permits a default configured JGit to decompress these XML file objects using the faster random-access arrays, rather than re-seeking through an inflate stream, significantly reducing checkout time after a clone.

Since this new limit may be dangerously close to the JVM maximum heap size, every allocation attempt is now wrapped in a try/catch so that JGit can degrade by switching to the large object stream mode when the allocation is refused. It will run slower, but the operation will still complete.

The large stream mode will run very well for big objects that aren't delta compressed, and is acceptable for delta compressed objects that are using only forward referencing copy instructions. Copies using prior offsets are still going to be horrible, and there is nothing we can do about it except increase core.streamFileThreshold.

We might in the future want to consider changing the way the delta generators work in JGit and native C Git to avoid prior offsets once an object reaches a certain size, even if that causes the delta instruction stream to be slightly larger. Unfortunately native C Git won't want to do that until it is also able to stream objects rather than malloc them as contiguous blocks.

Change-Id: Ief7a3896afce15073e80d3691bed90c6a3897307
Signed-off-by: Shawn O. Pearce <spearce@spearce.org>
Signed-off-by: Chris Aniszczyk <caniszczyk@gmail.com>
13 years ago
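The degrade-on-allocation-failure behaviour described above (wrap the byte[] allocation in a try/catch and fall back to the large object stream mode when the JVM refuses it) can be sketched as follows. The loader shape and method names here are hypothetical, not JGit's ObjectLoader API.

  import java.io.IOException;
  import java.io.InputStream;

  /**
   * Sketch: try to inflate the object into a byte array, and if the JVM
   * refuses the allocation, fall back to a streaming path instead of failing.
   */
  public abstract class ObjectMaterializer {
      private final long streamFileThreshold; // e.g. 50 * 1024 * 1024

      protected ObjectMaterializer(long streamFileThreshold) {
          this.streamFileThreshold = streamFileThreshold;
      }

      public InputStream open(long size) throws IOException {
          if (size <= streamFileThreshold) {
              try {
                  byte[] buf = new byte[(int) size]; // may throw OutOfMemoryError
                  inflateInto(buf);
                  return new java.io.ByteArrayInputStream(buf);
              } catch (OutOfMemoryError notEnoughHeap) {
                  // allocation refused: degrade to the slower streaming mode
              }
          }
          return openStream(); // large object stream mode
      }

      /** Inflate the whole object into the supplied array. */
      protected abstract void inflateInto(byte[] buf) throws IOException;

      /** Open a streaming view that never materializes the full object. */
      protected abstract InputStream openStream() throws IOException;
  }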
DIRCChecksumMismatch=DIRC checksum mismatch
DIRCExtensionIsTooLargeAt=DIRC extension {0} is too large at {1} bytes.
DIRCExtensionNotSupportedByThisVersion=DIRC extension {0} not supported by this version.
DIRCHasTooManyEntries=DIRC has too many entries.
DIRCUnrecognizedExtendedFlags=Unrecognized extended flags: {0}
JRELacksMD5Implementation=JRE lacks MD5 implementation
URINotSupported=URI not supported: {0}
URLNotFound={0} not found
aNewObjectIdIsRequired=A NewObjectId is required.
abbreviationLengthMustBeNonNegative=Abbreviation length must not be negative.
abortingRebase=Aborting rebase: resetting to {0}
abortingRebaseFailed=Could not abort rebase
advertisementCameBefore=advertisement of {0}^{} came before {1}
advertisementOfCameBefore=advertisement of {0}^{} came before {1}
amazonS3ActionFailed={0} of '{1}' failed: {2} {3}
amazonS3ActionFailedGivingUp={0} of '{1}' failed: Giving up after {2} attempts.
ambiguousObjectAbbreviation=Object abbreviation {0} is ambiguous
anExceptionOccurredWhileTryingToAddTheIdOfHEAD=An exception occurred while trying to add the Id of HEAD
anSSHSessionHasBeenAlreadyCreated=An SSH session has been already created
applyingCommit=Applying {0}
atLeastOnePathIsRequired=At least one path is required.
atLeastOnePatternIsRequired=At least one pattern is required.
atLeastTwoFiltersNeeded=At least two filters needed.
authenticationNotSupported=authentication not supported
badBase64InputCharacterAt=Bad Base64 input character at {0} : {1} (decimal)
badEntryDelimiter=Bad entry delimiter
badEntryName=Bad entry name: {0}
badEscape=Bad escape: {0}
badGroupHeader=Bad group header
badObjectType=Bad object type: {0}
badSectionEntry=Bad section entry: {0}
base64InputNotProperlyPadded=Base64 input not properly padded.
baseLengthIncorrect=base length incorrect
bareRepositoryNoWorkdirAndIndex=Bare Repository has neither a working tree nor an index
blobNotFound=Blob not found: {0}
branchNameInvalid=Branch name {0} is not allowed
blobNotFoundForPath=Blob not found: {0} for path: {1}
cachedPacksPreventsIndexCreation=Using cached packs prevents index creation
cannotBeCombined=Cannot be combined.
cannotCombineTreeFilterWithRevFilter=Cannot combine TreeFilter {0} with RevFilter {1}.
cannotCommitOnARepoWithState=Cannot commit on a repo with state: {0}
cannotCommitWriteTo=Cannot commit write to {0}
cannotConnectPipes=cannot connect pipes
cannotConvertScriptToText=Cannot convert script to text
cannotCreateConfig=cannot create config
cannotCreateDirectory=Cannot create directory {0}
cannotCreateHEAD=cannot create HEAD
cannotDeleteCheckedOutBranch=Branch {0} is checked out and cannot be deleted
cannotDeleteFile=Cannot delete file: {0}
cannotDeleteStaleTrackingRef2=Cannot delete stale tracking ref {0}: {1}
cannotDeleteStaleTrackingRef=Cannot delete stale tracking ref {0}
cannotDetermineProxyFor=Cannot determine proxy for {0}
cannotDownload=Cannot download {0}
cannotExecute=cannot execute: {0}
cannotGet=Cannot get {0}
cannotListRefs=cannot list refs
cannotLock=Cannot lock {0}
cannotLockFile=Cannot lock file {0}
cannotLockPackIn=Cannot lock pack in {0}
cannotMatchOnEmptyString=Cannot match on empty string.
cannotMoveIndexTo=Cannot move index to {0}
cannotMovePackTo=Cannot move pack to {0}
cannotOpenService=cannot open {0}
cannotParseGitURIish=Cannot parse Git URI-ish
cannotPullOnARepoWithState=Cannot pull into a repository with state: {0}
cannotRead=Cannot read {0}
cannotReadBlob=Cannot read blob {0}
cannotReadCommit=Cannot read commit {0}
cannotReadFile=Cannot read file {0}
cannotReadHEAD=cannot read HEAD: {0} {1}
cannotReadObject=Cannot read object
cannotReadTree=Cannot read tree {0}
cannotRebaseWithoutCurrentHead=Cannot rebase without a current HEAD
cannotResolveLocalTrackingRefForUpdating=Cannot resolve local tracking ref {0} for updating.
cannotStoreObjects=cannot store objects
cannotUnloadAModifiedTree=Cannot unload a modified tree.
cannotWorkWithOtherStagesThanZeroRightNow=Cannot work with other stages than zero right now. Won't write corrupt index.
canOnlyCherryPickCommitsWithOneParent=Can only cherry-pick commits which have exactly one parent
canOnlyRevertCommitsWithOneParent=Can only revert commits which have exactly one parent
cantFindObjectInReversePackIndexForTheSpecifiedOffset=Can't find object in (reverse) pack index for the specified offset {0}
cantPassMeATree=Can't pass me a tree!
channelMustBeInRange0_255=channel {0} must be in range [0, 255]
characterClassIsNotSupported=The character class {0} is not supported.
checkoutConflictWithFile=Checkout conflict with file: {0}
checkoutConflictWithFiles=Checkout conflict with files: {0}
checkoutUnexpectedResult=Checkout returned unexpected result {0}
classCastNotA=Not a {0}
collisionOn=Collision on {0}
commandWasCalledInTheWrongState=Command {0} was called in the wrong state
commitAlreadyExists=exists {0}
commitMessageNotSpecified=commit message not specified
commitOnRepoWithoutHEADCurrentlyNotSupported=Commit on repo without HEAD currently not supported
compressingObjects=Compressing objects
connectionFailed=connection failed
connectionTimeOut=Connection time out: {0}
contextMustBeNonNegative=context must be >= 0
corruptObjectBadStream=bad stream
corruptObjectBadStreamCorruptHeader=bad stream, corrupt header
corruptObjectGarbageAfterSize=garbage after size
corruptObjectIncorrectLength=incorrect length
corruptObjectInvalidEntryMode=invalid entry mode
corruptObjectInvalidMode2=invalid mode {0}
corruptObjectInvalidMode3=invalid mode {0} for {1} '{2}' in {3}.
corruptObjectInvalidMode=invalid mode
corruptObjectInvalidType2=invalid type {0}
corruptObjectInvalidType=invalid type
corruptObjectMalformedHeader=malformed header: {0}
corruptObjectNegativeSize=negative size
corruptObjectNoAuthor=no author
corruptObjectNoCommitter=no committer
corruptObjectNoHeader=no header
corruptObjectNoObject=no object
corruptObjectNoTagName=no tag name
corruptObjectNoTaggerBadHeader=no tagger/bad header
corruptObjectNoTaggerHeader=no tagger header
corruptObjectNoType=no type
corruptObjectNotree=no tree
corruptObjectPackfileChecksumIncorrect=Packfile checksum incorrect.
corruptionDetectedReReadingAt=Corruption detected re-reading at {0}
couldNotCheckOutBecauseOfConflicts=Could not check out because of conflicts
couldNotDeleteLockFileShouldNotHappen=Could not delete lock file. Should not happen
couldNotDeleteTemporaryIndexFileShouldNotHappen=Could not delete temporary index file. Should not happen
couldNotGetAdvertisedRef=Could not get advertised Ref for branch {0}
couldNotLockHEAD=Could not lock HEAD
couldNotReadIndexInOneGo=Could not read index in one go, only {0} out of {1} read
couldNotReadObjectWhileParsingCommit=Could not read an object while parsing commit {0}
couldNotRenameDeleteOldIndex=Could not rename delete old index
couldNotRenameTemporaryFile=Could not rename temporary file {0} to new location {1}
couldNotRenameTemporaryIndexFileToIndex=Could not rename temporary index file to index
couldNotURLEncodeToUTF8=Could not URL encode to UTF-8
couldNotWriteFile=Could not write file {0}
countingObjects=Counting objects
createBranchFailedUnknownReason=Create branch failed for unknown reason
createBranchUnexpectedResult=Create branch returned unexpected result {0}
createNewFileFailed=Could not create new file {0}
credentialPassword=Password
credentialUsername=Username
daemonAlreadyRunning=Daemon already running
deleteBranchUnexpectedResult=Delete branch returned unexpected result {0}
deleteFileFailed=Could not delete file {0}
deletingNotSupported=Deleting {0} not supported.
destinationIsNotAWildcard=Destination is not a wildcard.
detachedHeadDetected=HEAD is detached
dirCacheDoesNotHaveABackingFile=DirCache does not have a backing file
dirCacheFileIsNotLocked=DirCache {0} not locked
dirCacheIsNotLocked=DirCache is not locked
dirtyFilesExist=Dirty files exist. Refusing to merge
doesNotHandleMode=Does not handle mode {0} ({1})
downloadCancelled=Download cancelled
downloadCancelledDuringIndexing=Download cancelled during indexing
duplicateAdvertisementsOf=duplicate advertisements of {0}
duplicateRef=Duplicate ref: {0}
duplicateRemoteRefUpdateIsIllegal=Duplicate remote ref update is illegal. Affected remote name: {0}
duplicateStagesNotAllowed=Duplicate stages not allowed
eitherGitDirOrWorkTreeRequired=One of setGitDir or setWorkTree must be called.
emptyPathNotPermitted=Empty path not permitted.
encryptionError=Encryption error: {0}
endOfFileInEscape=End of file in escape
entryNotFoundByPath=Entry not found by path: {0}
enumValueNotSupported2=Invalid value: {0}.{1}={2}
enumValueNotSupported3=Invalid value: {0}.{1}.{2}={3}
enumValuesNotAvailable=Enumerated values of type {0} not available
errorDecodingFromFile=Error decoding from file {0}
errorEncodingFromFile=Error encoding from file {0}
errorInBase64CodeReadingStream=Error in Base64 code reading stream.
errorInPackedRefs=error in packed-refs
errorInvalidProtocolWantedOldNewRef=error: invalid protocol: wanted 'old new ref'
errorListing=Error listing {0}
errorOccurredDuringUnpackingOnTheRemoteEnd=error occurred during unpacking on the remote end: {0}
errorReadingInfoRefs=error reading info/refs
exceptionCaughtDuringExecutionOfAddCommand=Exception caught during execution of add command
exceptionCaughtDuringExecutionOfCherryPickCommand=Exception caught during execution of cherry-pick command. {0}
exceptionCaughtDuringExecutionOfCommitCommand=Exception caught during execution of commit command
exceptionCaughtDuringExecutionOfFetchCommand=Exception caught during execution of fetch command
exceptionCaughtDuringExecutionOfMergeCommand=Exception caught during execution of merge command. {0}
exceptionCaughtDuringExecutionOfPushCommand=Exception caught during execution of push command
exceptionCaughtDuringExecutionOfPullCommand=Exception caught during execution of pull command
exceptionCaughtDuringExecutionOfRevertCommand=Exception caught during execution of revert command. {0}
exceptionCaughtDuringExecutionOfRmCommand=Exception caught during execution of rm command
exceptionCaughtDuringExecutionOfTagCommand=Exception caught during execution of tag command
exceptionOccuredDuringAddingOfOptionToALogCommand=Exception occurred during adding of {0} as option to a Log command
exceptionOccuredDuringReadingOfGIT_DIR=Exception occurred during reading of $GIT_DIR/{0}. {1}
expectedACKNAKFoundEOF=Expected ACK/NAK, found EOF
expectedACKNAKGot=Expected ACK/NAK, got: {0}
expectedBooleanStringValue=Expected boolean string value
expectedCharacterEncodingGuesses=Expected {0} character encoding guesses
expectedEOFReceived=expected EOF; received '{0}' instead
expectedGot=expected '{0}', got '{1}'
expectedPktLineWithService=expected pkt-line with '# service=-', got '{0}'
expectedReceivedContentType=expected Content-Type {0}; received Content-Type {1}
expectedReportForRefNotReceived={0}: expected report for ref {1} not received
failedUpdatingRefs=failed updating refs
failureDueToOneOfTheFollowing=Failure due to one of the following:
failureUpdatingFETCH_HEAD=Failure updating FETCH_HEAD: {0}
failureUpdatingTrackingRef=Failure updating tracking ref {0}: {1}
fileCannotBeDeleted=File cannot be deleted: {0}
fileIsTooBigForThisConvenienceMethod=File is too big for this convenience method ({0} bytes).
fileIsTooLarge=File is too large: {0}
fileModeNotSetForPath=FileMode not set for path {0}
flagIsDisposed={0} is disposed.
flagNotFromThis={0} not from this.
flagsAlreadyCreated={0} flags already created.
funnyRefname=funny refname
hugeIndexesAreNotSupportedByJgitYet=Huge indexes are not supported by jgit, yet
hunkBelongsToAnotherFile=Hunk belongs to another file
hunkDisconnectedFromFile=Hunk disconnected from file
hunkHeaderDoesNotMatchBodyLineCountOf=Hunk header {0} does not match body line count of {1}
illegalArgumentNotA=Not {0}
illegalStateExists=exists {0}
improperlyPaddedBase64Input=Improperly padded Base64 input.
inMemoryBufferLimitExceeded=In-memory buffer limit exceeded
incorrectHashFor=Incorrect hash for {0}; computed {1} as a {2} from {3} bytes.
incorrectOBJECT_ID_LENGTH=Incorrect OBJECT_ID_LENGTH.
incorrectObjectType_COMMITnorTREEnorBLOBnorTAG=COMMIT nor TREE nor BLOB nor TAG
indexFileIsInUse=Index file is in use
indexFileIsTooLargeForJgit=Index file is too large for jgit
indexSignatureIsInvalid=Index signature is invalid: {0}
indexWriteException=Modified index could not be written
integerValueOutOfRange=Integer value {0}.{1} out of range
internalRevisionError=internal revision error
interruptedWriting=Interrupted writing {0}
invalidAdvertisementOf=invalid advertisement of {0}
invalidAncestryLength=Invalid ancestry length
invalidBooleanValue=Invalid boolean value: {0}.{1}={2}
invalidChannel=Invalid channel {0}
invalidCharacterInBase64Data=Invalid character in Base64 data.
invalidCommitParentNumber=Invalid commit parent number
invalidEncryption=Invalid encryption
invalidGitType=invalid git type: {0}
invalidId=Invalid id {0}
invalidIdLength=Invalid id length {0}; should be {1}
invalidIntegerValue=Invalid integer value: {0}.{1}={2}
invalidKey=Invalid key: {0}
invalidLineInConfigFile=Invalid line in config file
invalidModeFor=Invalid mode {0} for {1} {2} in {3}.
invalidModeForPath=Invalid mode {0} for path {1}
invalidObject=Invalid {0} {1}:{2}
invalidOldIdSent=invalid old id sent
invalidPacketLineHeader=Invalid packet line header: {0}
invalidPath=Invalid path: {0}
invalidRefName=Invalid ref name: {0}
invalidRemote=Invalid remote: {0}
invalidStageForPath=Invalid stage {0} for path {1}
invalidTagOption=Invalid tag option: {0}
invalidTimeout=Invalid timeout: {0}
invalidURL=Invalid URL {0}
invalidWildcards=Invalid wildcards {0}
invalidWindowSize=Invalid window size
isAStaticFlagAndHasNorevWalkInstance={0} is a static flag and has no RevWalk instance
kNotInRange=k {0} not in {1} - {2}
largeObjectException={0} exceeds size limit
largeObjectOutOfMemory=Out of memory loading {0}
largeObjectExceedsByteArray=Object {0} exceeds 2 GiB byte array limit
largeObjectExceedsLimit=Object {0} exceeds {1} limit, actual size is {2}
lengthExceedsMaximumArraySize=Length exceeds maximum array size
listingAlternates=Listing alternates
localObjectsIncomplete=Local objects incomplete.
localRefIsMissingObjects=Local ref {0} is missing object(s).
lockCountMustBeGreaterOrEqual1=lockCount must be >= 1
lockError=lock error: {0}
lockOnNotClosed=Lock on {0} not closed.
lockOnNotHeld=Lock on {0} not held.
malformedpersonIdentString=Malformed PersonIdent string (no < was found): {0}
mergeConflictOnNotes=Merge conflict on note {0}. base = {1}, ours = {2}, theirs = {3}
mergeConflictOnNonNoteEntries=Merge conflict on non-note entries: base = {0}, ours = {1}, theirs = {2}
mergeStrategyAlreadyExistsAsDefault=Merge strategy "{0}" already exists as a default strategy
mergeStrategyDoesNotSupportHeads=merge strategy {0} does not support {1} heads to be merged into HEAD
mergeUsingStrategyResultedInDescription=Merge of revisions {0} with base {1} using strategy {2} resulted in: {3}. {4}
missingAccesskey=Missing accesskey.
missingConfigurationForKey=No value for key {0} found in configuration
missingDeltaBase=delta base
missingForwardImageInGITBinaryPatch=Missing forward-image in GIT binary patch
missingObject=Missing {0} {1}
missingPrerequisiteCommits=missing prerequisite commits:
missingRequiredParameter=Parameter "{0}" is missing
missingSecretkey=Missing secretkey.
mixedStagesNotAllowed=Mixed stages not allowed
mkDirFailed=Creating directory {0} failed
mkDirsFailed=Creating directories for {0} failed
multipleMergeBasesFor=Multiple merge bases for:\n {0}\n {1} found:\n {2}\n {3}
need2Arguments=Need 2 arguments
needPackOut=need packOut
needsAtLeastOneEntry=Needs at least one entry
needsWorkdir=Needs workdir
newlineInQuotesNotAllowed=Newline in quotes not allowed
noApplyInDelete=No apply in delete
noClosingBracket=No closing {0} found for {1} at index {2}.
noHEADExistsAndNoExplicitStartingRevisionWasSpecified=No HEAD exists and no explicit starting revision was specified
noHMACsupport=No {0} support: {1}
noMergeHeadSpecified=No merge head specified
noSuchRef=no such ref
noXMLParserAvailable=No XML parser available.
notABoolean=Not a boolean: {0}
notABundle=not a bundle
notADIRCFile=Not a DIRC file.
notAGitDirectory=not a git directory
notAPACKFile=Not a PACK file.
notARef=Not a ref: {0}: {1}
notASCIIString=Not ASCII string: {0}
notAuthorized=not authorized
notAValidPack=Not a valid pack {0}
notFound=not found.
notValid={0} not valid
nothingToFetch=Nothing to fetch.
nothingToPush=Nothing to push.
notMergedExceptionMessage=Branch was not deleted as it has not been merged yet; use the force option to delete it anyway
objectAtHasBadZlibStream=Object at {0} in {1} has bad zlib stream
objectAtPathDoesNotHaveId=Object at path "{0}" does not have an id assigned. All object ids must be assigned prior to writing a tree.
objectIsCorrupt=Object {0} is corrupt: {1}
objectIsNotA=Object {0} is not a {1}.
objectNotFoundIn=Object {0} not found in {1}.
obtainingCommitsForCherryPick=Obtaining commits that need to be cherry-picked
offsetWrittenDeltaBaseForObjectNotFoundInAPack=Offset-written delta base for object not found in a pack
onlyAlreadyUpToDateAndFastForwardMergesAreAvailable=only already-up-to-date and fast forward merges are available
onlyOneFetchSupported=Only one fetch supported
onlyOneOperationCallPerConnectionIsSupported=Only one operation call per connection is supported.
openFilesMustBeAtLeast1=Open files must be >= 1
openingConnection=Opening connection
operationCanceled=Operation {0} was canceled
outputHasAlreadyBeenStarted=Output has already been started.
packChecksumMismatch=Pack checksum mismatch
packCorruptedWhileWritingToFilesystem=Pack corrupted while writing to filesystem
packDoesNotMatchIndex=Pack {0} does not match index
packFileInvalid=Pack file invalid: {0}
packHasUnresolvedDeltas=pack has unresolved deltas
packObjectCountMismatch=Pack object count mismatch: pack {0} index {1}: {2}
packTooLargeForIndexVersion1=Pack too large for index version 1
packetSizeMustBeAtLeast=packet size {0} must be >= {1}
packetSizeMustBeAtMost=packet size {0} must be <= {1}
packfileCorruptionDetected=Packfile corruption detected: {0}
packfileIsTruncated=Packfile is truncated.
packingCancelledDuringObjectsWriting=Packing cancelled during objects writing
packWriterStatistics=Total {0,number,#0} (delta {1,number,#0}), reused {2,number,#0} (delta {3,number,#0})
pathIsNotInWorkingDir=Path is not in working dir
peeledLineBeforeRef=Peeled line before ref.
peerDidNotSupplyACompleteObjectGraph=peer did not supply a complete object graph
prefixRemote=remote:
problemWithResolvingPushRefSpecsLocally=Problem with resolving push ref specs locally: {0}
progressMonUploading=Uploading {0}
propertyIsAlreadyNonNull=Property is already non-null
pullTaskName=Pull
pushCancelled=push cancelled
pushIsNotSupportedForBundleTransport=Push is not supported for bundle transport
pushNotPermitted=push not permitted
rawLogMessageDoesNotParseAsLogEntry=Raw log message does not parse as log entry
readTimedOut=Read timed out
readingObjectsFromLocalRepositoryFailed=reading objects from local repository failed: {0}
receivingObjects=Receiving objects
refAlreadExists=Ref {0} already exists
refNotResolved=Ref {0} cannot be resolved
refUpdateReturnCodeWas=RefUpdate return code was: {0}
reflogsNotYetSupportedByRevisionParser=reflogs not yet supported by revision parser
remoteConfigHasNoURIAssociated=Remote config "{0}" has no URIs associated
remoteDoesNotHaveSpec=Remote does not have {0} available for fetch.
remoteDoesNotSupportSmartHTTPPush=remote does not support smart HTTP push
remoteHungUpUnexpectedly=remote hung up unexpectedly
remoteNameCantBeNull=Remote name can't be null.
renameBranchFailedBecauseTag=Cannot rename as Ref {0} is a tag
renameBranchFailedUnknownReason=Rename failed with unknown reason
renameBranchUnexpectedResult=Unexpected rename result {0}
renamesAlreadyFound=Renames have already been found.
renamesBreakingModifies=Breaking apart modified file pairs
renamesFindingByContent=Finding renames by content similarity
renamesFindingExact=Finding exact renames
renamesRejoiningModifies=Rejoining modified file pairs
repositoryAlreadyExists=Repository already exists: {0}
repositoryConfigFileInvalid=Repository config file {0} invalid {1}
repositoryIsRequired=Repository is required.
repositoryNotFound=repository not found: {0}
repositoryState_applyMailbox=Apply mailbox
repositoryState_bisecting=Bisecting
repositoryState_conflicts=Conflicts
repositoryState_merged=Merged
repositoryState_normal=Normal
repositoryState_rebase=Rebase
repositoryState_rebaseInteractive=Rebase interactive
repositoryState_rebaseOrApplyMailbox=Rebase/Apply mailbox
repositoryState_rebaseWithMerge=Rebase w/merge
requiredHashFunctionNotAvailable=Required hash function {0} not available.
resolvingDeltas=Resolving deltas
resettingHead=Resetting head to {0}
resultLengthIncorrect=result length incorrect
rewinding=Rewinding to commit {0}
searchForReuse=Finding sources
searchForSizes=Getting sizes
sequenceTooLargeForDiffAlgorithm=Sequence too large for difference algorithm.
serviceNotPermitted={0} not permitted
shortCompressedStreamAt=Short compressed stream at {0}
shortReadOfBlock=Short read of block.
shortReadOfOptionalDIRCExtensionExpectedAnotherBytes=Short read of optional DIRC extension {0}; expected another {1} bytes within the section.
shortSkipOfBlock=Short skip of block.
signingNotSupportedOnTag=Signing isn't supported on tag operations yet.
similarityScoreMustBeWithinBounds=Similarity score must be between 0 and 100.
sizeExceeds2GB=Path {0} size {1} exceeds 2 GiB limit.
smartHTTPPushDisabled=smart HTTP push disabled
sourceDestinationMustMatch=Source/Destination must match.
sourceIsNotAWildcard=Source is not a wildcard.
sourceRefDoesntResolveToAnyObject=Source ref {0} doesn't resolve to any object.
sourceRefNotSpecifiedForRefspec=Source ref not specified for refspec: {0}
staleRevFlagsOn=Stale RevFlags on {0}
startingReadStageWithoutWrittenRequestDataPendingIsNotSupported=Starting read stage without written request data pending is not supported
statelessRPCRequiresOptionToBeEnabled=stateless RPC requires {0} to be enabled
submodulesNotSupported=Submodules are not supported
symlinkCannotBeWrittenAsTheLinkTarget=Symlink "{0}" cannot be written as the link target cannot be read from within Java.
systemConfigFileInvalid=System wide config file {0} is invalid {1}
tagNameInvalid=tag name {0} is invalid
tagOnRepoWithoutHEADCurrentlyNotSupported=Tag on repository without HEAD currently not supported
tSizeMustBeGreaterOrEqual1=tSize must be >= 1
theFactoryMustNotBeNull=The factory must not be null
timerAlreadyTerminated=Timer already terminated
topologicalSortRequired=Topological sort required.
transportExceptionBadRef=Empty ref: {0}: {1}
transportExceptionEmptyRef=Empty ref: {0}
transportExceptionInvalid=Invalid {0} {1}:{2}
transportExceptionMissingAssumed=Missing assumed {0}
transportExceptionReadRef=read {0}
treeEntryAlreadyExists=Tree entry "{0}" already exists.
treeIteratorDoesNotSupportRemove=TreeIterator does not support remove()
truncatedHunkLinesMissingForAncestor=Truncated hunk, at least {0} lines missing for ancestor {1}
truncatedHunkNewLinesMissing=Truncated hunk, at least {0} new lines are missing
truncatedHunkOldLinesMissing=Truncated hunk, at least {0} old lines are missing
unableToCheckConnectivity=Unable to check connectivity.
unableToStore=Unable to store {0}.
unableToWrite=Unable to write {0}
unencodeableFile=Unencodeable file: {0}
unexpectedCompareResult=Unexpected metadata comparison result: {0}
unexpectedEndOfConfigFile=Unexpected end of config file
unexpectedHunkTrailer=Unexpected hunk trailer
unexpectedOddResult=odd: {0} + {1} - {2}
unexpectedRefReport={0}: unexpected ref report: {1}
unexpectedReportLine2={0} unexpected report line: {1}
unexpectedReportLine=unexpected report line: {0}
unknownDIRCVersion=Unknown DIRC version {0}
unknownHost=unknown host
unknownIndexVersionOrCorruptIndex=Unknown index version (or corrupt index): {0}
unknownObject=unknown object
unknownObjectType=Unknown object type {0}.
unknownRepositoryFormat2=Unknown repository format "{0}"; expected "0".
unknownRepositoryFormat=Unknown repository format
unknownZlibError=Unknown zlib error.
unpackException=Exception while parsing pack stream
unmergedPath=Unmerged path: {0}
unmergedPaths=Repository contains unmerged paths
unreadablePackIndex=Unreadable pack index: {0}
unrecognizedRef=Unrecognized ref: {0}
unsupportedCommand0=unsupported command 0
unsupportedEncryptionAlgorithm=Unsupported encryption algorithm: {0}
unsupportedEncryptionVersion=Unsupported encryption version: {0}
unsupportedOperationNotAddAtEnd=Not add-at-end: {0}
unsupportedPackIndexVersion=Unsupported pack index version {0}
unsupportedPackVersion=Unsupported pack version {0}.
updatingRefFailed=Updating the ref {0} to {1} failed. ReturnCode from RefUpdate.update() was {2}
uriNotFound={0} not found
userConfigFileInvalid=User config file {0} invalid {1}
walkFailure=Walk failure.
windowSizeMustBeLesserThanLimit=Window size must be < limit
windowSizeMustBePowerOf2=Window size must be power of 2
writeTimedOut=Write timed out
writerAlreadyInitialized=Writer already initialized
writingNotPermitted=Writing not permitted
writingNotSupported=Writing {0} not supported.
writingObjects=Writing objects
wrongDecompressedLength=wrong decompressed length
wrongRepositoryState=Wrong Repository State: {0}
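The values above are java.text.MessageFormat patterns, so {0}, {1}, ... are filled in at runtime. The snippet below is a stand-alone sketch of loading one of these keys with ResourceBundle and formatting it; JGit itself resolves these strings through its own translation-bundle (NLS) machinery, and the bundle base name used here is an assumption about the classpath layout.

  import java.text.MessageFormat;
  import java.util.Locale;
  import java.util.ResourceBundle;

  public class MessageDemo {
      public static void main(String[] args) {
          // Assumes JGitText.properties is on the classpath under this base name.
          ResourceBundle bundle = ResourceBundle.getBundle(
                  "org.eclipse.jgit.JGitText", Locale.ENGLISH);

          // cannotDeleteFile=Cannot delete file: {0}
          String pattern = bundle.getString("cannotDeleteFile");
          String message = MessageFormat.format(pattern, "/tmp/example.txt");

          System.out.println(message); // Cannot delete file: /tmp/example.txt
      }
  }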