| |
Using the official Go tool `golang.org/x/tools/cmd/deadcode@latest`,
mentioned in the [Go blog](https://go.dev/blog/deadcode).
Just run `deadcode .` in the project root folder and it prints a list of
unused functions, though it produces some false positives.
This PR removes dead code detected in `models/issues`.
|
|
|
|
|
|
|
|
|
| |
Nowadays the cache is used almost everywhere in Gitea and it cannot be
disabled, otherwise some features become unavailable.
So I think we can just remove the option to enable the cache, which means
the cache can no longer be disabled.
Of course, the cache configuration can still be used to control how Gitea
uses the cache.
|
| |
These four functions are duplicated, especially as interface methods. I think
we just need to keep `MustID` and remove the other three.
```
MustID(b []byte) ObjectID
MustIDFromString(s string) ObjectID
NewID(b []byte) (ObjectID, error)
NewIDFromString(s string) (ObjectID, error)
```
Introduced the new interface method `ComputeHash`, which replaces the
interface `HasherInterface`, so we no longer need to keep two interfaces.
Reintroduced `git.NewIDFromString` and `git.MustIDFromString`. The new
functions detect the hash length to decide which object format it is:
if the length is 40 it's SHA1, if it's 64 it's SHA256. This is only correct
when the commit ID is a full one, so the parameter must always be a full
commit ID.
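A minimal sketch of that length-based detection, assuming a hypothetical helper name (the actual Gitea functions and object-format types may differ):
```go
package main

import "fmt"

// objectFormatNameFromHex is an illustrative helper, not Gitea's actual API:
// it maps a full hex commit ID to an object format name purely by length.
func objectFormatNameFromHex(commitID string) (string, error) {
	switch len(commitID) {
	case 40: // SHA-1: 20 bytes = 40 hex characters
		return "sha1", nil
	case 64: // SHA-256: 32 bytes = 64 hex characters
		return "sha256", nil
	default:
		return "", fmt.Errorf("unexpected hash length %d, expected a full commit ID", len(commitID))
	}
}

func main() {
	name, _ := objectFormatNameFromHex("ec1feedbf582b05b6a5e8c59fb2457f25d053ba2")
	fmt.Println(name) // sha1
}
```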
@AdamMajer Please review.
|
|
|
|
|
| |
Update golang.org/x/crypto for CVE-2023-48795 and update other packages.
`go-git` is not updated because more time is needed to figure out why some
tests fail with it.
|
| |
|
|
|
|
|
|
|
|
| |
* Close #24483
* Close #28123
* Close #23682
* Close #23149
(maybe more)
|
|
|
|
|
|
|
| |
- Remove `ObjectFormatID`
- Remove the function `ObjectFormatFromID`.
- Use `Sha1ObjectFormat` directly rather than a pointer, because it's an
empty struct.
- Store `ObjectFormatName` in the `repository` struct
|
|
|
|
|
|
| |
Refactor the Hash interfaces and centralize the hash function. This will allow
easier introduction of different hash functions later on.
This forms the "no-op" part of the SHA256 enablement patch.
|
|
|
|
| |
There could be a nil pointer dereference if the file is not found, because
that specific error is suppressed rather than handled.
|
| |
## Changes
- Add deprecation warning to `Token` and `AccessToken` authentication
methods in swagger.
- Add deprecation warning header to API response. Example:
```
HTTP/1.1 200 OK
...
Warning: token and access_token API authentication is deprecated
...
```
- Add setting `DISABLE_QUERY_AUTH_TOKEN` to reject query-string auth
tokens entirely. The default is `false` (see the sketch below).
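A minimal sketch of how such a rejection plus deprecation warning could look. The setting name comes from this PR; the handler shape and function name are illustrative assumptions, not Gitea's actual middleware:
```go
package apiauth

import "net/http"

// checkQueryToken is an illustrative helper, not Gitea's real middleware.
// It rejects query-string tokens when disableQueryAuthToken is set, and
// otherwise emits the deprecation warning header described above.
func checkQueryToken(w http.ResponseWriter, r *http.Request, disableQueryAuthToken bool) bool {
	token := r.URL.Query().Get("token")
	if token == "" {
		token = r.URL.Query().Get("access_token")
	}
	if token == "" {
		return true // no query-string token used, nothing to do
	}
	if disableQueryAuthToken {
		http.Error(w, "query string token authentication is disabled", http.StatusUnauthorized)
		return false
	}
	w.Header().Set("Warning", "token and access_token API authentication is deprecated")
	return true
}
```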
## Next steps
- `DISABLE_QUERY_AUTH_TOKEN` should be true in a subsequent release and
the methods should be removed in swagger
- `DISABLE_QUERY_AUTH_TOKEN` should be removed and the implementation of
the auth methods in question should be removed
## Open questions
- Should there be further changes to the swagger documentation?
Deprecation is not yet supported for security definitions (coming in
[OpenAPI Spec version
3.2.0](https://github.com/OAI/OpenAPI-Specification/issues/2506))
- Should the API router logger sanitize urls that use `token` or
`access_token`? (This is obviously an insufficient solution on its own)
---------
Co-authored-by: delvh <dev.lh@web.de>
|
|
|
| |
Continuation of #27798; move more functions to `db.Find` and `db.Count`.
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
- Currently there's code to recover gracefully from panics that happen
during the execution of cron tasks. However, this recover code was never
run, because `RunWithShutdownContext` also contains code that recovers from
any panic and then gracefully shuts down Forgejo. Because
`RunWithShutdownContext` registers that code last, it runs first, which in
this case is not the behavior we want.
- Move the recover code inside the function, so that it runs before
`RunWithShutdownContext`'s recover code (which is now a no-op); see the
sketch below.
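A minimal illustration of the underlying Go behavior, using simplified stand-ins for the cron task and `RunWithShutdownContext` (not the Forgejo code itself): the recover deferred inside the inner function handles the panic, so the outer graceful-shutdown recover never fires.
```go
package main

import "log"

// runWithShutdownRecover stands in for RunWithShutdownContext's recover wrapper.
func runWithShutdownRecover(f func()) {
	defer func() {
		if r := recover(); r != nil {
			log.Printf("outer recover: would trigger a graceful shutdown: %v", r)
		}
	}()
	f()
}

// cronTask defers its own recover, so a panic is handled here first.
func cronTask() {
	defer func() {
		if r := recover(); r != nil {
			log.Printf("cron recover: task marked as failed, Forgejo keeps running: %v", r)
		}
	}()
	panic("task failed")
}

func main() {
	runWithShutdownRecover(cronTask) // only the cron task recover fires
}
```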
Fixes: https://codeberg.org/forgejo/forgejo/issues/1910
Co-authored-by: Gusted <postmaster@gusted.xyz>
|
| |
Fix #28056
This PR checks whether the repository has zero branches when a branch is
pushed. If so, the repository hasn't been synced yet.
The cause is that after upgrading from v1.20 to v1.21, a user may push
branches without ever visiting the repository in the web UI. All repository
web routes check whether a branch sync is necessary, but pushes had no such
check.
Every repository is in one of two states: synced or not synced. If a
repository has zero branches, it is assumed to be in the non-synced state;
otherwise it is considered synced. If it is synced, we only need to update
the branch or insert the new branch; otherwise we do a full sync. That way,
almost no extra load is added per push, so this approach stays fast.
For the implementation, we first try to update the branch; if the update
succeeds with affected records > 0, we are done, because the branch was
already in the database. If no record is affected, the branch does not
exist in the database, and there are two possibilities: either this is a
new branch and we just need to insert the record, or the branches haven't
been synced yet and we need to sync all branches into the database.
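A hedged sketch of that update-first strategy using plain `database/sql`; the table, the column names, and the `fullBranchSync` helper are illustrative assumptions, not Gitea's actual schema or code.
```go
package branchsync

import "database/sql"

// fullBranchSync is a hypothetical helper that would enumerate the git
// branches of the repository and insert them all into the database.
func fullBranchSync(db *sql.DB, repoID int64) error {
	// ... full sync elided in this sketch ...
	return nil
}

// syncBranchOnPush tries the cheap update first and falls back to an
// insert or a full sync, mirroring the logic described above.
func syncBranchOnPush(db *sql.DB, repoID int64, branch, commitID string) error {
	res, err := db.Exec(
		`UPDATE branch SET commit_id = ? WHERE repo_id = ? AND name = ?`,
		commitID, repoID, branch,
	)
	if err != nil {
		return err
	}
	if n, _ := res.RowsAffected(); n > 0 {
		return nil // branch already tracked: the update was enough
	}
	var count int64
	if err := db.QueryRow(`SELECT COUNT(*) FROM branch WHERE repo_id = ?`, repoID).Scan(&count); err != nil {
		return err
	}
	if count == 0 {
		return fullBranchSync(db, repoID) // repository was never synced
	}
	// Repository is synced but the branch is new: insert it.
	_, err = db.Exec(
		`INSERT INTO branch (repo_id, name, commit_id) VALUES (?, ?, ?)`,
		repoID, branch, commitID,
	)
	return err
}
```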
|
| |
parameter is zero and also introduce new generic methods (#28220)
The function `GetByBean` has an obvious defect: when a field is an empty
value, it is silently ignored. Users can then get a wrong result, which
could possibly be turned into a security problem.
To avoid that possibility, this PR removes the function `GetByBean` and all
of its references, and introduces some new generic functions to be used
instead. The recommended usage is shown below.
```go
// query an object by its id
obj, err := db.GetByID[Object](ctx, id)
// query with other conditions
obj, err := db.Get[Object](ctx, builder.Eq{"a": a, "b": b})
```
|
| |
Fix #28328
```
func (p *PullRequestComment) GetDiffHunk() string {
if p == nil || p.DiffHunk == nil {
return ""
}
return *p.DiffHunk
}
```
This function in the `go-github` package may return an empty diff. When it
is empty, the following code panics because it accesses `ss[1]`:
https://github.com/go-gitea/gitea/blob/ec1feedbf582b05b6a5e8c59fb2457f25d053ba2/services/migrations/gitea_uploader.go#L861-L867
https://github.com/go-gitea/gitea/blob/ec1feedbf582b05b6a5e8c59fb2457f25d053ba2/modules/git/diff.go#L97-L101
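A hedged sketch of the kind of length guard that avoids such a panic; the split separator and the function shape are illustrative assumptions, and the real fix lives in the linked files.
```go
package diffsketch

import "strings"

// safeSecondPart is illustrative only: it shows the general pattern of
// checking the slice length before indexing ss[1], which is what an empty
// diff hunk from go-github would otherwise blow up on.
func safeSecondPart(diffHunk, sep string) (string, bool) {
	ss := strings.Split(diffHunk, sep)
	if len(ss) < 2 {
		return "", false // empty or malformed hunk: nothing to index
	}
	return ss[1], true
}
```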
|
|
|
|
|
|
|
| |
Fixes #28324
The name parameter can't contain some characters
(https://github.com/keybase/go-crypto/blob/master/openpgp/keys.go#L680)
but is optional. Therefore just use an empty string.
|
|
|
|
|
| |
Changing an issue status, assignee, labels or milestone without also
adding a comment would not update the index, resulting in wrong search
results.
|
|
|
|
|
|
|
|
|
|
|
| |
- Push commit updates are run in a queue, and updates can come from less
traceable places such as Git over SSH; therefore, add more information
about which repository the pushUpdate failed on.
Refs: https://codeberg.org/forgejo/forgejo/pulls/1723
(cherry picked from commit 37ab9460394800678d2208fed718e719d7a5d96f)
Co-authored-by: Gusted <postmaster@gusted.xyz>
|
|
|
|
|
|
|
|
|
| |
- Tell the binding middleware which locale should be used for the
"required" error.
- Resolves https://codeberg.org/forgejo/forgejo/issues/1683
(cherry picked from commit 5a2d7966127b5639332038e9925d858ab54fc360)
Co-authored-by: Gusted <postmaster@gusted.xyz>
|
|
|
|
| |
This PR fixes some missing checks for private repositories' data on
web routes and API routes.
|
|
|
|
| |
For those simple objects, it's unnecessary to write the find and count
methods again and again.
|
|
|
|
|
|
|
|
|
|
|
|
| |
Adds the possibility to skip workflow execution if the commit message
contains a string like [skip ci] or similar.
The default strings are the same as on GitHub; users can also set custom
ones in app.ini.
Reference:
https://docs.github.com/en/actions/managing-workflow-runs/skipping-workflow-runs
Close #28020
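A minimal sketch of such a check; the default strings below mirror some of the skip instructions in the GitHub documentation linked above, while the function name and where it would be called from are assumptions.
```go
package cisketch

import "strings"

// defaultSkipStrings mirrors some of the skip instructions documented by GitHub.
var defaultSkipStrings = []string{
	"[skip ci]", "[ci skip]", "[no ci]", "[skip actions]", "[actions skip]",
}

// shouldSkipWorkflows reports whether a commit message asks CI to be skipped.
func shouldSkipWorkflows(commitMessage string) bool {
	msg := strings.ToLower(commitMessage)
	for _, s := range defaultSkipStrings {
		if strings.Contains(msg, s) {
			return true
		}
	}
	return false
}
```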
|
|
|
|
|
|
|
|
|
|
| |
Fixes #28088
Fixes #28094
Added missing tests.
---------
Co-authored-by: Lunny Xiao <xiaolunwen@gmail.com>
|
|
|
|
|
| |
instead of owner ID (#28007)" (#28049)
This reverts commit 60522fc96f1fa4675e95010e4b1535e0eac21910 (#28007).
|
| |
owner ID (#28007)
Changed the behavior to calculate the package quota limit using the package
`creator ID` instead of the `owner ID`.
Currently, users are allowed to create an unlimited number of
organizations, each of which has its own package quota limit, giving users
effectively unlimited package space across different organization scopes.
This fix calculates the package quota based on the
`package version creator ID` instead of the `package version owner ID`
(which might be an organization), so that users cannot take up more space
than the configured package settings allow.
There is also a side case in which a user can publish packages to a
specific package version initially published by a different user, consuming
that user's package size quota. The behavior in this fix should still be
better, because the total amount of space is limited to the quota of the
users sharing the same organization scope.
|
|
|
|
|
|
|
| |
Fixes https://codeberg.org/forgejo/forgejo/issues/1458
Some mails, such as issue creation mails, are missing the reply-to-comment
address. This PR fixes that and specifies which comment types should offer
a reply possibility.
|
|
|
|
|
|
|
|
| |
Fixes #27819
We support two-factor logins for the normal web login and for basic auth.
For basic auth the two-factor check was implemented in three different
places, and callers needed to know that this check is necessary. This
PR moves the check into the basic auth handler itself.
|
|
|
|
|
|
|
|
|
|
| |
- On user deletion, delete the action runners that the user has created.
- Add a database consistency check to remove action runners whose owner no
longer exists.
- Resolves https://codeberg.org/forgejo/forgejo/issues/1720
(cherry picked from commit 009ca7223dab054f7f760b7ccae69e745eebfabb)
Co-authored-by: Gusted <postmaster@gusted.xyz>
|
|
|
|
|
|
|
|
|
| |
We should not use `asset.ID` in DownloadFunc because DownloadFunc is a
closure.
https://github.com/go-gitea/gitea/blob/1bf5527eac6b947010c8faf408f6747de2a2384f/services/migrations/gitea_downloader.go#L284-L295
A similar bug when migrating from GitHub has been fixed in #14703. This
PR fixes the bug when migrating from Gitea and GitLab.
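A minimal illustration of the closure pitfall behind this bug, assuming pre-Go-1.22 loop-variable semantics (the types and names here are illustrative, not the migration code):
```go
package main

import "fmt"

type asset struct{ ID int64 }

func main() {
	assets := []asset{{ID: 1}, {ID: 2}, {ID: 3}}

	var funcs []func() int64
	for _, a := range assets {
		// Buggy: the closure captures the loop variable itself, so with
		// Go < 1.22 every closure ends up seeing the last asset's ID.
		funcs = append(funcs, func() int64 { return a.ID })
	}
	for _, f := range funcs {
		fmt.Println(f()) // prints 3, 3, 3 under the old loop semantics
	}

	// Fix: copy the value needed by the closure into a per-iteration variable.
	funcs = funcs[:0]
	for _, a := range assets {
		id := a.ID
		funcs = append(funcs, func() int64 { return id })
	}
	for _, f := range funcs {
		fmt.Println(f()) // prints 1, 2, 3
	}
}
```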
|
|
|
|
|
|
|
|
|
|
|
|
| |
inactive auth source (#27798)
Steps to reproduce:
First, create a new OAuth2 source.
Then a user logs in with this OAuth2 source.
Disable the OAuth2 source.
Visit Users -> Settings -> Security; a 500 error is displayed.
This happens because the page only loads active OAuth2 sources instead of
all OAuth2 sources.
|
| |
After many refactoring PRs for the "locale" and "template context
function", the ".locale" is no longer needed for web templates.
This PR does a cleanup:
1. Remove `ctx.Data["locale"]` from the web context.
2. Use `ctx.Locale` in `500.tmpl`, for consistency.
3. Add a test check for the `500 page` locale usage.
4. Remove `Str2html` and `DotEscape` from the mail template context data;
they are copy&paste errors introduced by #19169 and #16200. These
functions are template functions (provided by the common renderer), not
template data variables.
5. Make the email `SendAsync` function mockable. (I was planning to add
more tests, but that would make this PR much too complex, so the tests
could be done in another PR.)
|
|
|
|
|
|
|
| |
Fix #23742
---------
Co-authored-by: KN4CK3R <admin@oldschoolhack.me>
|
|
|
|
|
|
| |
Closes #27783
This PR lists all package versions in the `Packages` index, not only the
latest version.
|
| |
Due to a bug in the GitLab API, the diff_refs field is populated in the
response when fetching an individual merge request, but not when
fetching a list of them. That field is used to populate the merge base
commit SHA.
While there is detection for the merge base even when not populated by
the downloader, that detection is not flawless. Specifically, when a
GitLab merge request has a single commit, and gets merged with the
squash strategy, the base branch will be fast-forwarded instead of a
separate squash or merge commit being created. The merge base detection
attempts to find the last commit on the base branch that is also on the
PR branch, but in the fast-forward case that is the PR's only commit.
Assuming the head commit is also the merge base results in the import of
a PR with 0 commits and no diff.
This PR uses the individual merge request endpoint to fetch merge
request data with the diff_refs field. With its data, the base merge
commit can be properly set, which—by not relying on the detection
mentioned above—correctly imports PRs that were "merged" by
fast-forwarding the base branch.
ref: https://gitlab.com/gitlab-org/gitlab/-/issues/29620
|
|
|
|
|
|
|
|
|
|
| |
Before this PR, the PR migration code populates Gitea's MergedCommitID
field by using GitLab's merge_commit_sha field. However, that field is
only populated when the PR was merged using a merge strategy. When a
squash strategy is used, squash_commit_sha is populated instead.
Given that Gitea does not keep track of merge and squash commits
separately, this PR simply populates Gitea's MergedCommitID by using
whichever field is present in the GitLab API response.
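A small sketch of that fallback; the struct here only mirrors the two GitLab response fields mentioned above (merge_commit_sha / squash_commit_sha) and is not the actual go-gitlab type.
```go
package main

import "fmt"

// mergeRequest mirrors just the two GitLab fields discussed above;
// it is illustrative only.
type mergeRequest struct {
	MergeCommitSHA  string
	SquashCommitSHA string
}

// mergedCommitSHA returns whichever SHA the GitLab API populated.
func mergedCommitSHA(mr mergeRequest) string {
	if mr.MergeCommitSHA != "" {
		return mr.MergeCommitSHA
	}
	return mr.SquashCommitSHA
}

func main() {
	fmt.Println(mergedCommitSHA(mergeRequest{SquashCommitSHA: "abc123"})) // abc123
}
```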
|
|
|
|
| |
Add missing `.Close()` calls. The current code does not delete the
temporary files if the data grows over 32 MB.
|
| |
Hello there,
the Cargo index over HTTP is now preferred over git for package updates: we
should not force users who do not need the Git repo to have the repo
created/updated on each publish (it can still be created in the package
settings).
The current behavior when publishing is to check whether the repo exists,
create it on the fly if not, and then update its content.
The Cargo HTTP index does not rely on the repo itself, so this is useless
for everyone not using the git protocol for the Cargo registry.
This PR only disables the on-the-fly creation of the repo when publishing
a crate.
This is linked to #26844 (error 500 when trying to publish a crate if the
user is missing write access to the repo) because the repo is now optional.
---------
Co-authored-by: KN4CK3R <admin@oldschoolhack.me>
|
|
|
| |
https://github.com/golangci/golangci-lint/releases/tag/v1.55.0
|
| |
|
|
|
|
| |
To address #27273; replaces #24873.
|
|
|
|
|
|
|
| |
Fixes https://codeberg.org/forgejo/forgejo/issues/1514
I had to remove `RenameOrganization` to avoid a circular import.
We should really add some foreign keys to the database.
|
| |
|
|
|
| |
Fixes #27650
|
| |
When `webhook.PROXY_URL` is set, the old code checks whether the proxy
host is in `ALLOWED_HOST_LIST` and otherwise rejects requests through the
proxy. This requires users to add the proxy host to `ALLOWED_HOST_LIST`,
but it then allows requests to any port on that host, even though the
proxy host is probably an internal address.
Things may be even worse: `ALLOWED_HOST_LIST` doesn't really apply once
requests go to the allowed proxy, because the proxy could forward them to
any host.
This PR fixes it by:
- If the proxy has been set, always allowing connections to the proxy's
host and port (see the sketch below).
- Checking `ALLOWED_HOST_LIST` before forwarding.
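A hedged sketch of that check order; the function shape, names, and allow-list callback are illustrative assumptions, not Gitea's actual webhook code.
```go
package webhooksketch

import "net/url"

// webhookTargetAllowed is illustrative only: connecting to the configured
// proxy itself is always permitted, while the webhook's destination host is
// validated against the allow-list before the request is forwarded.
func webhookTargetAllowed(target, proxyURL *url.URL, hostAllowed func(host string) bool) bool {
	if proxyURL != nil && target.Host == proxyURL.Host {
		return true // the proxy endpoint itself is always reachable
	}
	return hostAllowed(target.Hostname())
}
```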
|
| |
|
|
|
| |
Fix #27540
|
|
|
| |
Target #27065
|
|
|
| |
Last part of #27065
|
| |
Closes #27455
> The mechanism responsible for long-term authentication (the 'remember
me' cookie) uses a weak construction technique. It will hash the user's
hashed password and the rands value; it will then call the secure cookie
code, which will encrypt the user's name with the computed hash. If one
were able to dump the database, they could extract those two values to
rebuild that cookie and impersonate a user. That vulnerability exists
from the date the dump was obtained until a user changed their password.
>
> To fix this security issue, the cookie could be created and verified
using a different technique such as the one explained at
https://paragonie.com/blog/2015/04/secure-authentication-php-with-long-term-persistence#secure-remember-me-cookies.
The PR removes the now obsolete setting `COOKIE_USERNAME`.
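For context, a minimal sketch of the selector/validator scheme described in the linked article, with hypothetical names (this is not Gitea's actual implementation): the cookie carries a random selector plus a random validator, and only a hash of the validator is stored server-side, so a database dump alone cannot be used to rebuild the cookie.
```go
package main

import (
	"crypto/rand"
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// newRememberMeToken is illustrative only. It returns the value to put in
// the cookie and the (selector, hashed validator) pair to persist; only the
// hash is stored, so leaking the database does not reveal the validator.
func newRememberMeToken() (cookieValue, selector string, validatorHash [32]byte, err error) {
	sel := make([]byte, 12)
	val := make([]byte, 32)
	if _, err = rand.Read(sel); err != nil {
		return
	}
	if _, err = rand.Read(val); err != nil {
		return
	}
	selector = hex.EncodeToString(sel)
	validatorHash = sha256.Sum256(val)
	cookieValue = selector + ":" + hex.EncodeToString(val)
	return
}

func main() {
	cookie, sel, hash, _ := newRememberMeToken()
	fmt.Println(cookie, sel, hex.EncodeToString(hash[:]))
}
```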
|
| |
|