[v2] Convert apt source fetcher into native bitbake variant

Message ID dc3375ca-e53c-4774-bf2a-2489acf6a775@siemens.com
State Superseded, archived
Series [v2] Convert apt source fetcher into native bitbake variant

Commit Message

Jan Kiszka Nov. 15, 2024, 4:40 p.m. UTC
From: Jan Kiszka <jan.kiszka@siemens.com>

There is no major functional difference, but by registering an official
fetcher for apt://, we no longer have to manipulate SRC_URI.

As the fetching no longer takes place in separate tasks, do_fetch and
do_unpack need to gain the extra flags that were previously assigned to
apt_fetch and apt_unpack. That happens conditionally, i.e. only if
SRC_URI actually contains an apt:// URL.

One difference from the original version is the possibility - even if
of minor practical relevance - to unpack multiple apt sources into S.
The old version contained a loop but directed dpkg-source to a
pre-existing directory, which would have failed on the second iteration.
The new version folds the results together after each step.

Another minor difference is that unversioned fetches put their results
into the same subfolder of DL_DIR, even when a distro codename is
specified. Only versioned fetches get dedicated folders (and .done
stamps).

No progress reporting is implemented because dpkg-source unfortunately
does not provide upfront information that would make the progress
predictable and thus expressible as a percentage.

Signed-off-by: Jan Kiszka <jan.kiszka@siemens.com>
---

Changes in v2:
 - rebased, including the removal of isar-apt sources in apt_unpack

I'm cautiously optimistic that this change also resolves the previously
seen issue in CI.

 meta/classes/dpkg-base.bbclass | 104 ++++-----------------------------
 meta/lib/aptsrc_fetcher.py     |  93 +++++++++++++++++++++++++++++
 2 files changed, 104 insertions(+), 93 deletions(-)
 create mode 100644 meta/lib/aptsrc_fetcher.py

Comments

Uladzimir Bely Nov. 27, 2024, 2:07 p.m. UTC | #1
On Fri, 2024-11-15 at 17:40 +0100, Jan Kiszka wrote:
> From: Jan Kiszka <jan.kiszka@siemens.com>
> 
> There is no major functional difference, but we no longer have to
> manipulate SRC_URI by registering an official fetcher for apt://.
> 
> As the fetching no longer takes place in separate tasks, do_fetch and
> do_unpack need to gain the extra flags that were so far assigned to
> apt_fetch and apt_unpack. That happens conditionally, i.e. only if
> SRC_URI actually contains an apt URL.
> 
> One difference to the original version is the possibility - even if
> practically of minor relevance - to unpack multiple apt sources into
> S.
> The old version contained a loop but was directing dpkg-source to a
> pre-existing dir which would have failed on the second iteration. The
> new version now folds the results together after each step.
> 
> Another minor difference is that unversioned fetches put their
> results
> into the same subfolder in DL_DIR, also when specifying a distro
> codename. Only versioned fetches get dedicated folders (and .done
> stamps).
> 
> There is no progress report realized because dpkg-source
> unfortunately
> does not provide information upfront to make this predictable, thus
> expressible in form of percentage.
> 
> Signed-off-by: Jan Kiszka <jan.kiszka@siemens.com>
> ---
> 
> Changes in v2:
>  - rebased, including the removal of isar-apt sources in apt_unpack
> 
> I'm carefully optimistic that this change also resolves the
> previously 
> seen issue in CI.
> 
>  meta/classes/dpkg-base.bbclass | 104 ++++---------------------------
> --
>  meta/lib/aptsrc_fetcher.py     |  93 +++++++++++++++++++++++++++++
>  2 files changed, 104 insertions(+), 93 deletions(-)
>  create mode 100644 meta/lib/aptsrc_fetcher.py
> 
> diff --git a/meta/classes/dpkg-base.bbclass b/meta/classes/dpkg-
> base.bbclass
> index b4ea8e17..c02c07a8 100644
> --- a/meta/classes/dpkg-base.bbclass
> +++ b/meta/classes/dpkg-base.bbclass
> @@ -79,110 +79,28 @@ do_adjust_git[lockfiles] +=
> "${DL_DIR}/git/isar.lock"
>  inherit patch
>  addtask patch after do_adjust_git
>  
> -SRC_APT ?= ""
> -
> -# filter out all "apt://" URIs out of SRC_URI and stick them into
> SRC_APT
>  python() {
> -    src_uri = (d.getVar('SRC_URI', False) or "").split()
> +    from bb.fetch2 import methods
>  
> -    prefix = "apt://"
> -    src_apt = []
> -    for u in src_uri:
> -        if u.startswith(prefix):
> -            src_apt.append(u[len(prefix) :])
> -            d.setVar('SRC_URI:remove', u)
> +    # apt-src fetcher
> +    import aptsrc_fetcher
> +    methods.append(aptsrc_fetcher.AptSrc())
>  
> -    d.prependVar('SRC_APT', ' '.join(src_apt))
> +    src_uri = (d.getVar('SRC_URI', False) or "").split()
> +    for u in src_uri:
> +        if u.startswith("apt://"):
> +            d.appendVarFlag('do_fetch', 'depends',
> d.getVar('SCHROOT_DEP'))
>  
> -    if len(d.getVar('SRC_APT').strip()) > 0:
> -        bb.build.addtask('apt_unpack', 'do_patch', '', d)
> -        bb.build.addtask('cleanall_apt', 'do_cleanall', '', d)
> +            d.appendVarFlag('do_unpack', 'cleandirs', d.getVar('S'))
> +            d.setVarFlag('do_unpack', 'network',
> d.getVar('TASK_USE_SUDO'))
> +            break
>  
>      # container docker fetcher
>      import container_fetcher
> -    from bb.fetch2 import methods
>  
>      methods.append(container_fetcher.Container())
>  }
>  
> -do_apt_fetch() {
> -    E="${@ isar_export_proxies(d)}"
> -    schroot_create_configs
> -
> -    session_id=$(schroot -q -b -c ${SBUILD_CHROOT})
> -    echo "Started session: ${session_id}"
> -
> -    schroot_cleanup() {
> -        schroot -q -f -e -c ${session_id} > /dev/null 2>&1
> -        schroot_delete_configs
> -    }
> -    trap 'exit 1' INT HUP QUIT TERM ALRM USR1
> -    trap 'schroot_cleanup' EXIT
> -
> -    schroot -r -c ${session_id} -d / -u root -- \
> -        rm /etc/apt/sources.list.d/isar-apt.list
> /etc/apt/preferences.d/isar-apt
> -    schroot -r -c ${session_id} -d / -- \
> -        sh -c '
> -            set -e
> -            for uri in $2; do
> -                mkdir -p /downloads/deb-src/"$1"/${uri}
> -                cd /downloads/deb-src/"$1"/${uri}
> -                apt-get -y --download-only --only-source source
> ${uri}
> -            done' \
> -                my_script "${BASE_DISTRO}-${BASE_DISTRO_CODENAME}"
> "${SRC_APT}"
> -
> -    schroot -e -c ${session_id}
> -    schroot_delete_configs
> -}
> -
> -addtask apt_fetch
> -do_apt_fetch[lockfiles] += "${REPO_ISAR_DIR}/isar.lock"
> -do_apt_fetch[network] = "${TASK_USE_NETWORK_AND_SUDO}"
> -
> -# Add dependency from the correct schroot: host or target
> -do_apt_fetch[depends] += "${SCHROOT_DEP}"
> -
> -do_apt_unpack() {
> -    rm -rf ${S}
> -    schroot_create_configs
> -
> -    session_id=$(schroot -q -b -c ${SBUILD_CHROOT})
> -    echo "Started session: ${session_id}"
> -
> -    schroot_cleanup() {
> -        schroot -q -f -e -c ${session_id} > /dev/null 2>&1
> -        schroot_delete_configs
> -    }
> -    trap 'exit 1' INT HUP QUIT TERM ALRM USR1
> -    trap 'schroot_cleanup' EXIT
> -
> -    schroot -r -c ${session_id} -d / -u root -- \
> -        rm /etc/apt/sources.list.d/isar-apt.list
> /etc/apt/preferences.d/isar-apt
> -    schroot -r -c ${session_id} -d / -- \
> -        sh -c '
> -            set -e
> -            for uri in $2; do
> -                dscfile="$(apt-get -y -qq --print-uris --only-source
> source $uri | cut -d " " -f2 | grep -E "*.dsc")"
> -                cd ${PP}
> -                cp /downloads/deb-src/"${1}"/${uri}/* ${PP}
> -                dpkg-source -x "${dscfile}" "${PPS}"
> -            done' \
> -                my_script "${BASE_DISTRO}-${BASE_DISTRO_CODENAME}"
> "${SRC_APT}"
> -
> -    schroot -e -c ${session_id}
> -    schroot_delete_configs
> -}
> -do_apt_unpack[network] = "${TASK_USE_SUDO}"
> -
> -addtask apt_unpack after do_apt_fetch
> -
> -do_cleanall_apt[nostamp] = "1"
> -do_cleanall_apt() {
> -    for uri in "${SRC_APT}"; do
> -        rm -rf "${DEBSRCDIR}/${BASE_DISTRO}-
> ${BASE_DISTRO_CODENAME}/$uri"
> -    done
> -}
> -
>  def get_package_srcdir(d):
>      s = os.path.abspath(d.getVar("S"))
>      workdir = os.path.abspath(d.getVar("WORKDIR"))
> diff --git a/meta/lib/aptsrc_fetcher.py b/meta/lib/aptsrc_fetcher.py
> new file mode 100644
> index 00000000..ee726202
> --- /dev/null
> +++ b/meta/lib/aptsrc_fetcher.py
> @@ -0,0 +1,93 @@
> +# This software is a part of ISAR.
> +# Copyright (c) Siemens AG, 2024
> +#
> +# SPDX-License-Identifier: MIT
> +
> +from bb.fetch2 import FetchError
> +from bb.fetch2 import FetchMethod
> +from bb.fetch2 import logger
> +from bb.fetch2 import runfetchcmd
> +
> +class AptSrc(FetchMethod):
> +    def supports(self, ud, d):
> +        return ud.type in ['apt']
> +
> +    def urldata_init(self, ud, d):
> +        ud.src_package = ud.url[len('apt://'):]
> +        ud.host = ud.host.replace('=', '_')
> +
> +        base_distro = d.getVar('BASE_DISTRO')
> +
> +        # For these distros we know that the same version means the
> same
> +        # source package, also across distro releases.
> +        distro_suffix = '' if base_distro in ['debian', 'ubuntu']
> else \
> +            '-' + d.getVar('BASE_DISTRO_CODENAME')
> +
> +        ud.localfile='deb-src/' + base_distro + distro_suffix + '/'
> + ud.host
> +
> +    def download(self, ud, d):
> +        bb.utils.exec_flat_python_func('isar_export_proxies', d)
> +        bb.build.exec_func('schroot_create_configs', d)
> +
> +        sbuild_chroot = d.getVar('SBUILD_CHROOT')
> +        session_id = runfetchcmd(f'schroot -q -b -c
> {sbuild_chroot}', d).strip()
> +        logger.info(f'Started session: {session_id}')
> +
> +        repo_isar_dir = d.getVar('REPO_ISAR_DIR')
> +        lockfile = bb.utils.lockfile(f'{repo_isar_dir}/isar.lock')
> +
> +        try:
> +            runfetchcmd(f'''
> +                set -e
> +                schroot -r -c {session_id} -d / -u root -- \
> +                    rm /etc/apt/sources.list.d/isar-apt.list
> /etc/apt/preferences.d/isar-apt
> +                schroot -r -c {session_id} -d / -- \
> +                    sh -c '
> +                        set -e
> +                        mkdir -p /downloads/{ud.localfile}
> +                        cd /downloads/{ud.localfile}
> +                        apt-get -y --download-only --only-source
> source {ud.src_package}
> +                        '
> +                ''', d)
> +        except (OSError, FetchError):
> +            raise
> +        finally:
> +            bb.utils.unlockfile(lockfile)
> +            runfetchcmd(f'schroot -q -f -e -c {session_id}', d)
> +            bb.build.exec_func('schroot_delete_configs', d)
> +
> +    def unpack(self, ud, rootdir, d):
> +        bb.build.exec_func('schroot_create_configs', d)
> +
> +        sbuild_chroot = d.getVar('SBUILD_CHROOT')
> +        session_id = runfetchcmd(f'schroot -q -b -c
> {sbuild_chroot}', d).strip()
> +        logger.info(f'Started session: {session_id}')
> +
> +        pp = d.getVar('PP')
> +        pps = d.getVar('PPS')
> +        try:
> +            runfetchcmd(f'''
> +                set -e
> +                schroot -r -c {session_id} -d / -u root -- \
> +                    rm /etc/apt/sources.list.d/isar-apt.list
> /etc/apt/preferences.d/isar-apt
> +                schroot -r -c {session_id} -d / -- \
> +                    sh -c '
> +                        set -e
> +                        dscfile=$(apt-get -y -qq --print-uris --
> only-source source {ud.src_package} | \
> +                                  cut -d " " -f2 | grep -E "\.dsc")
> +                        cp /downloads/{ud.localfile}/* {pp}
> +                        cd {pp}
> +                        mv -f {pps} {pps}.prev
> +                        dpkg-source -x "$dscfile" {pps}

Hello.

This still fails in CI, but this time I had some time to find the root
cause.

The problem is that buster (bullseye) and bookworm (trixie) provide
different versions of the "hello" package.

If we first build e.g. `mc:qemuamd64-bookworm:hello`, hello_2.10-3.dsc
is downloaded and the whole "downloads/deb-src/debian/hello/" directory
is considered finished, marked by the "downloads/deb-src/debian/hello.done"
flag.

So, when e.g. an `mc:qemuamd64-bullseye:hello` build follows, it doesn't
download hello_2.10-2.dsc, which results in a dpkg-source error.

It doesn't matter whether we build both targets in parallel or
sequentially; the one built last always fails.
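
To illustrate the mechanism (a simplified sketch, not the actual bb.fetch2
code): once the unversioned .done stamp exists, the fetcher treats the
download as complete, no matter which .dsc version is actually sitting in
the folder.

    import os

    def need_download(dl_dir, localfile):
        # localfile is e.g. "deb-src/debian/hello" for an unversioned apt:// fetch
        localpath = os.path.join(dl_dir, localfile)
        donestamp = localpath + '.done'
        # The bookworm build creates the stamp for hello_2.10-3; the following
        # bullseye build sees the same stamp and never fetches hello_2.10-2.
        return not os.path.exists(donestamp)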

> +                        find {pps}.prev -mindepth 1 -maxdepth 1 -
> exec mv {{}} {pps}/ \;
> +                        rmdir {pps}.prev
> +                        '
> +                ''', d)
> +        except (OSError, FetchError):
> +            raise
> +        finally:
> +            runfetchcmd(f'schroot -q -f -e -c {session_id}', d)
> +            bb.build.exec_func('schroot_delete_configs', d)
> +
> +    def clean(self, ud, d):
> +        bb.utils.remove(ud.localpath, recurse=True)
Jan Kiszka Nov. 28, 2024, 4:55 a.m. UTC | #2
On 27.11.24 22:07, Uladzimir Bely wrote:
> On Fri, 2024-11-15 at 17:40 +0100, Jan Kiszka wrote:
>> From: Jan Kiszka <jan.kiszka@siemens.com>
>>
>> There is no major functional difference, but we no longer have to
>> manipulate SRC_URI by registering an official fetcher for apt://.
>>
>> As the fetching no longer takes place in separate tasks, do_fetch and
>> do_unpack need to gain the extra flags that were so far assigned to
>> apt_fetch and apt_unpack. That happens conditionally, i.e. only if
>> SRC_URI actually contains an apt URL.
>>
>> One difference to the original version is the possibility - even if
>> practically of minor relevance - to unpack multiple apt sources into
>> S.
>> The old version contained a loop but was directing dpkg-source to a
>> pre-existing dir which would have failed on the second iteration. The
>> new version now folds the results together after each step.
>>
>> Another minor difference is that unversioned fetches put their
>> results
>> into the same subfolder in DL_DIR, also when specifying a distro
>> codename. Only versioned fetches get dedicated folders (and .done
>> stamps).
>>
>> There is no progress report realized because dpkg-source
>> unfortunately
>> does not provide information upfront to make this predictable, thus
>> expressible in form of percentage.
>>
>> Signed-off-by: Jan Kiszka <jan.kiszka@siemens.com>
>> ---
>>
>> Changes in v2:
>>  - rebased, including the removal of isar-apt sources in apt_unpack
>>
>> I'm carefully optimistic that this change also resolves the
>> previously 
>> seen issue in CI.
>>
>>  meta/classes/dpkg-base.bbclass | 104 ++++---------------------------
>> --
>>  meta/lib/aptsrc_fetcher.py     |  93 +++++++++++++++++++++++++++++
>>  2 files changed, 104 insertions(+), 93 deletions(-)
>>  create mode 100644 meta/lib/aptsrc_fetcher.py
>>
>> diff --git a/meta/classes/dpkg-base.bbclass b/meta/classes/dpkg-
>> base.bbclass
>> index b4ea8e17..c02c07a8 100644
>> --- a/meta/classes/dpkg-base.bbclass
>> +++ b/meta/classes/dpkg-base.bbclass
>> @@ -79,110 +79,28 @@ do_adjust_git[lockfiles] +=
>> "${DL_DIR}/git/isar.lock"
>>  inherit patch
>>  addtask patch after do_adjust_git
>>  
>> -SRC_APT ?= ""
>> -
>> -# filter out all "apt://" URIs out of SRC_URI and stick them into
>> SRC_APT
>>  python() {
>> -    src_uri = (d.getVar('SRC_URI', False) or "").split()
>> +    from bb.fetch2 import methods
>>  
>> -    prefix = "apt://"
>> -    src_apt = []
>> -    for u in src_uri:
>> -        if u.startswith(prefix):
>> -            src_apt.append(u[len(prefix) :])
>> -            d.setVar('SRC_URI:remove', u)
>> +    # apt-src fetcher
>> +    import aptsrc_fetcher
>> +    methods.append(aptsrc_fetcher.AptSrc())
>>  
>> -    d.prependVar('SRC_APT', ' '.join(src_apt))
>> +    src_uri = (d.getVar('SRC_URI', False) or "").split()
>> +    for u in src_uri:
>> +        if u.startswith("apt://"):
>> +            d.appendVarFlag('do_fetch', 'depends',
>> d.getVar('SCHROOT_DEP'))
>>  
>> -    if len(d.getVar('SRC_APT').strip()) > 0:
>> -        bb.build.addtask('apt_unpack', 'do_patch', '', d)
>> -        bb.build.addtask('cleanall_apt', 'do_cleanall', '', d)
>> +            d.appendVarFlag('do_unpack', 'cleandirs', d.getVar('S'))
>> +            d.setVarFlag('do_unpack', 'network',
>> d.getVar('TASK_USE_SUDO'))
>> +            break
>>  
>>      # container docker fetcher
>>      import container_fetcher
>> -    from bb.fetch2 import methods
>>  
>>      methods.append(container_fetcher.Container())
>>  }
>>  
>> -do_apt_fetch() {
>> -    E="${@ isar_export_proxies(d)}"
>> -    schroot_create_configs
>> -
>> -    session_id=$(schroot -q -b -c ${SBUILD_CHROOT})
>> -    echo "Started session: ${session_id}"
>> -
>> -    schroot_cleanup() {
>> -        schroot -q -f -e -c ${session_id} > /dev/null 2>&1
>> -        schroot_delete_configs
>> -    }
>> -    trap 'exit 1' INT HUP QUIT TERM ALRM USR1
>> -    trap 'schroot_cleanup' EXIT
>> -
>> -    schroot -r -c ${session_id} -d / -u root -- \
>> -        rm /etc/apt/sources.list.d/isar-apt.list
>> /etc/apt/preferences.d/isar-apt
>> -    schroot -r -c ${session_id} -d / -- \
>> -        sh -c '
>> -            set -e
>> -            for uri in $2; do
>> -                mkdir -p /downloads/deb-src/"$1"/${uri}
>> -                cd /downloads/deb-src/"$1"/${uri}
>> -                apt-get -y --download-only --only-source source
>> ${uri}
>> -            done' \
>> -                my_script "${BASE_DISTRO}-${BASE_DISTRO_CODENAME}"
>> "${SRC_APT}"
>> -
>> -    schroot -e -c ${session_id}
>> -    schroot_delete_configs
>> -}
>> -
>> -addtask apt_fetch
>> -do_apt_fetch[lockfiles] += "${REPO_ISAR_DIR}/isar.lock"
>> -do_apt_fetch[network] = "${TASK_USE_NETWORK_AND_SUDO}"
>> -
>> -# Add dependency from the correct schroot: host or target
>> -do_apt_fetch[depends] += "${SCHROOT_DEP}"
>> -
>> -do_apt_unpack() {
>> -    rm -rf ${S}
>> -    schroot_create_configs
>> -
>> -    session_id=$(schroot -q -b -c ${SBUILD_CHROOT})
>> -    echo "Started session: ${session_id}"
>> -
>> -    schroot_cleanup() {
>> -        schroot -q -f -e -c ${session_id} > /dev/null 2>&1
>> -        schroot_delete_configs
>> -    }
>> -    trap 'exit 1' INT HUP QUIT TERM ALRM USR1
>> -    trap 'schroot_cleanup' EXIT
>> -
>> -    schroot -r -c ${session_id} -d / -u root -- \
>> -        rm /etc/apt/sources.list.d/isar-apt.list
>> /etc/apt/preferences.d/isar-apt
>> -    schroot -r -c ${session_id} -d / -- \
>> -        sh -c '
>> -            set -e
>> -            for uri in $2; do
>> -                dscfile="$(apt-get -y -qq --print-uris --only-source
>> source $uri | cut -d " " -f2 | grep -E "*.dsc")"
>> -                cd ${PP}
>> -                cp /downloads/deb-src/"${1}"/${uri}/* ${PP}
>> -                dpkg-source -x "${dscfile}" "${PPS}"
>> -            done' \
>> -                my_script "${BASE_DISTRO}-${BASE_DISTRO_CODENAME}"
>> "${SRC_APT}"
>> -
>> -    schroot -e -c ${session_id}
>> -    schroot_delete_configs
>> -}
>> -do_apt_unpack[network] = "${TASK_USE_SUDO}"
>> -
>> -addtask apt_unpack after do_apt_fetch
>> -
>> -do_cleanall_apt[nostamp] = "1"
>> -do_cleanall_apt() {
>> -    for uri in "${SRC_APT}"; do
>> -        rm -rf "${DEBSRCDIR}/${BASE_DISTRO}-
>> ${BASE_DISTRO_CODENAME}/$uri"
>> -    done
>> -}
>> -
>>  def get_package_srcdir(d):
>>      s = os.path.abspath(d.getVar("S"))
>>      workdir = os.path.abspath(d.getVar("WORKDIR"))
>> diff --git a/meta/lib/aptsrc_fetcher.py b/meta/lib/aptsrc_fetcher.py
>> new file mode 100644
>> index 00000000..ee726202
>> --- /dev/null
>> +++ b/meta/lib/aptsrc_fetcher.py
>> @@ -0,0 +1,93 @@
>> +# This software is a part of ISAR.
>> +# Copyright (c) Siemens AG, 2024
>> +#
>> +# SPDX-License-Identifier: MIT
>> +
>> +from bb.fetch2 import FetchError
>> +from bb.fetch2 import FetchMethod
>> +from bb.fetch2 import logger
>> +from bb.fetch2 import runfetchcmd
>> +
>> +class AptSrc(FetchMethod):
>> +    def supports(self, ud, d):
>> +        return ud.type in ['apt']
>> +
>> +    def urldata_init(self, ud, d):
>> +        ud.src_package = ud.url[len('apt://'):]
>> +        ud.host = ud.host.replace('=', '_')
>> +
>> +        base_distro = d.getVar('BASE_DISTRO')
>> +
>> +        # For these distros we know that the same version means the
>> same
>> +        # source package, also across distro releases.
>> +        distro_suffix = '' if base_distro in ['debian', 'ubuntu']
>> else \
>> +            '-' + d.getVar('BASE_DISTRO_CODENAME')
>> +
>> +        ud.localfile='deb-src/' + base_distro + distro_suffix + '/'
>> + ud.host
>> +
>> +    def download(self, ud, d):
>> +        bb.utils.exec_flat_python_func('isar_export_proxies', d)
>> +        bb.build.exec_func('schroot_create_configs', d)
>> +
>> +        sbuild_chroot = d.getVar('SBUILD_CHROOT')
>> +        session_id = runfetchcmd(f'schroot -q -b -c
>> {sbuild_chroot}', d).strip()
>> +        logger.info(f'Started session: {session_id}')
>> +
>> +        repo_isar_dir = d.getVar('REPO_ISAR_DIR')
>> +        lockfile = bb.utils.lockfile(f'{repo_isar_dir}/isar.lock')
>> +
>> +        try:
>> +            runfetchcmd(f'''
>> +                set -e
>> +                schroot -r -c {session_id} -d / -u root -- \
>> +                    rm /etc/apt/sources.list.d/isar-apt.list
>> /etc/apt/preferences.d/isar-apt
>> +                schroot -r -c {session_id} -d / -- \
>> +                    sh -c '
>> +                        set -e
>> +                        mkdir -p /downloads/{ud.localfile}
>> +                        cd /downloads/{ud.localfile}
>> +                        apt-get -y --download-only --only-source
>> source {ud.src_package}
>> +                        '
>> +                ''', d)
>> +        except (OSError, FetchError):
>> +            raise
>> +        finally:
>> +            bb.utils.unlockfile(lockfile)
>> +            runfetchcmd(f'schroot -q -f -e -c {session_id}', d)
>> +            bb.build.exec_func('schroot_delete_configs', d)
>> +
>> +    def unpack(self, ud, rootdir, d):
>> +        bb.build.exec_func('schroot_create_configs', d)
>> +
>> +        sbuild_chroot = d.getVar('SBUILD_CHROOT')
>> +        session_id = runfetchcmd(f'schroot -q -b -c
>> {sbuild_chroot}', d).strip()
>> +        logger.info(f'Started session: {session_id}')
>> +
>> +        pp = d.getVar('PP')
>> +        pps = d.getVar('PPS')
>> +        try:
>> +            runfetchcmd(f'''
>> +                set -e
>> +                schroot -r -c {session_id} -d / -u root -- \
>> +                    rm /etc/apt/sources.list.d/isar-apt.list
>> /etc/apt/preferences.d/isar-apt
>> +                schroot -r -c {session_id} -d / -- \
>> +                    sh -c '
>> +                        set -e
>> +                        dscfile=$(apt-get -y -qq --print-uris --
>> only-source source {ud.src_package} | \
>> +                                  cut -d " " -f2 | grep -E "\.dsc")
>> +                        cp /downloads/{ud.localfile}/* {pp}
>> +                        cd {pp}
>> +                        mv -f {pps} {pps}.prev
>> +                        dpkg-source -x "$dscfile" {pps}
> 
> Hello.
> 
> This still fails in CI, but this time I had some time to find the root
> cause.
> 
> The problem is that buster(bullseye) and bookworm(trixie) provide
> different versions of "hello" package.
> 
> If we first build e.g. `mc:qemuamd64-bookworm:hello`, hello_2.10-3.dsc
> is downloaded and the whole "downloads/deb-src/debian/hello/" is
> considered finished with "downloads/deb-src/debian/hello.done" flag. 
> 
> So, when e.g. `mc:qemuamd64-bullseye:hello` build follows, it doesn't
> download hello_2.10-2.dsc an results in dpkg-source error.
> 
> It doesn't matter if we build both targets in parallel or sequentially,
> the latest always fails.
> 

Thanks for the analysis. I'll check if I can reproduce and understand the
root cause.

Jan

-- 
Siemens AG, Technology
Linux Expert Center
Uladzimir Bely Nov. 28, 2024, 6:03 a.m. UTC | #3
On Thu, 2024-11-28 at 12:55 +0800, Jan Kiszka wrote:
> On 27.11.24 22:07, Uladzimir Bely wrote:
> > On Fri, 2024-11-15 at 17:40 +0100, Jan Kiszka wrote:
> > > From: Jan Kiszka <jan.kiszka@siemens.com>
> > > 
> > > There is no major functional difference, but we no longer have to
> > > manipulate SRC_URI by registering an official fetcher for apt://.
> > > 
> > > As the fetching no longer takes place in separate tasks, do_fetch
> > > and
> > > do_unpack need to gain the extra flags that were so far assigned
> > > to
> > > apt_fetch and apt_unpack. That happens conditionally, i.e. only
> > > if
> > > SRC_URI actually contains an apt URL.
> > > 
> > > One difference to the original version is the possibility - even
> > > if
> > > practically of minor relevance - to unpack multiple apt sources
> > > into
> > > S.
> > > The old version contained a loop but was directing dpkg-source to
> > > a
> > > pre-existing dir which would have failed on the second iteration.
> > > The
> > > new version now folds the results together after each step.
> > > 
> > > Another minor difference is that unversioned fetches put their
> > > results
> > > into the same subfolder in DL_DIR, also when specifying a distro
> > > codename. Only versioned fetches get dedicated folders (and .done
> > > stamps).
> > > 
> > > There is no progress report realized because dpkg-source
> > > unfortunately
> > > does not provide information upfront to make this predictable,
> > > thus
> > > expressible in form of percentage.
> > > 
> > > Signed-off-by: Jan Kiszka <jan.kiszka@siemens.com>
> > > ---
> > > 
> > > Changes in v2:
> > >  - rebased, including the removal of isar-apt sources in
> > > apt_unpack
> > > 
> > > I'm carefully optimistic that this change also resolves the
> > > previously 
> > > seen issue in CI.
> > > 
> > >  meta/classes/dpkg-base.bbclass | 104 ++++-----------------------
> > > ----
> > > --
> > >  meta/lib/aptsrc_fetcher.py     |  93
> > > +++++++++++++++++++++++++++++
> > >  2 files changed, 104 insertions(+), 93 deletions(-)
> > >  create mode 100644 meta/lib/aptsrc_fetcher.py
> > > 
> > > diff --git a/meta/classes/dpkg-base.bbclass b/meta/classes/dpkg-
> > > base.bbclass
> > > index b4ea8e17..c02c07a8 100644
> > > --- a/meta/classes/dpkg-base.bbclass
> > > +++ b/meta/classes/dpkg-base.bbclass
> > > @@ -79,110 +79,28 @@ do_adjust_git[lockfiles] +=
> > > "${DL_DIR}/git/isar.lock"
> > >  inherit patch
> > >  addtask patch after do_adjust_git
> > >  
> > > -SRC_APT ?= ""
> > > -
> > > -# filter out all "apt://" URIs out of SRC_URI and stick them
> > > into
> > > SRC_APT
> > >  python() {
> > > -    src_uri = (d.getVar('SRC_URI', False) or "").split()
> > > +    from bb.fetch2 import methods
> > >  
> > > -    prefix = "apt://"
> > > -    src_apt = []
> > > -    for u in src_uri:
> > > -        if u.startswith(prefix):
> > > -            src_apt.append(u[len(prefix) :])
> > > -            d.setVar('SRC_URI:remove', u)
> > > +    # apt-src fetcher
> > > +    import aptsrc_fetcher
> > > +    methods.append(aptsrc_fetcher.AptSrc())
> > >  
> > > -    d.prependVar('SRC_APT', ' '.join(src_apt))
> > > +    src_uri = (d.getVar('SRC_URI', False) or "").split()
> > > +    for u in src_uri:
> > > +        if u.startswith("apt://"):
> > > +            d.appendVarFlag('do_fetch', 'depends',
> > > d.getVar('SCHROOT_DEP'))
> > >  
> > > -    if len(d.getVar('SRC_APT').strip()) > 0:
> > > -        bb.build.addtask('apt_unpack', 'do_patch', '', d)
> > > -        bb.build.addtask('cleanall_apt', 'do_cleanall', '', d)
> > > +            d.appendVarFlag('do_unpack', 'cleandirs',
> > > d.getVar('S'))
> > > +            d.setVarFlag('do_unpack', 'network',
> > > d.getVar('TASK_USE_SUDO'))
> > > +            break
> > >  
> > >      # container docker fetcher
> > >      import container_fetcher
> > > -    from bb.fetch2 import methods
> > >  
> > >      methods.append(container_fetcher.Container())
> > >  }
> > >  
> > > -do_apt_fetch() {
> > > -    E="${@ isar_export_proxies(d)}"
> > > -    schroot_create_configs
> > > -
> > > -    session_id=$(schroot -q -b -c ${SBUILD_CHROOT})
> > > -    echo "Started session: ${session_id}"
> > > -
> > > -    schroot_cleanup() {
> > > -        schroot -q -f -e -c ${session_id} > /dev/null 2>&1
> > > -        schroot_delete_configs
> > > -    }
> > > -    trap 'exit 1' INT HUP QUIT TERM ALRM USR1
> > > -    trap 'schroot_cleanup' EXIT
> > > -
> > > -    schroot -r -c ${session_id} -d / -u root -- \
> > > -        rm /etc/apt/sources.list.d/isar-apt.list
> > > /etc/apt/preferences.d/isar-apt
> > > -    schroot -r -c ${session_id} -d / -- \
> > > -        sh -c '
> > > -            set -e
> > > -            for uri in $2; do
> > > -                mkdir -p /downloads/deb-src/"$1"/${uri}
> > > -                cd /downloads/deb-src/"$1"/${uri}
> > > -                apt-get -y --download-only --only-source source
> > > ${uri}
> > > -            done' \
> > > -                my_script "${BASE_DISTRO}-
> > > ${BASE_DISTRO_CODENAME}"
> > > "${SRC_APT}"
> > > -
> > > -    schroot -e -c ${session_id}
> > > -    schroot_delete_configs
> > > -}
> > > -
> > > -addtask apt_fetch
> > > -do_apt_fetch[lockfiles] += "${REPO_ISAR_DIR}/isar.lock"
> > > -do_apt_fetch[network] = "${TASK_USE_NETWORK_AND_SUDO}"
> > > -
> > > -# Add dependency from the correct schroot: host or target
> > > -do_apt_fetch[depends] += "${SCHROOT_DEP}"
> > > -
> > > -do_apt_unpack() {
> > > -    rm -rf ${S}
> > > -    schroot_create_configs
> > > -
> > > -    session_id=$(schroot -q -b -c ${SBUILD_CHROOT})
> > > -    echo "Started session: ${session_id}"
> > > -
> > > -    schroot_cleanup() {
> > > -        schroot -q -f -e -c ${session_id} > /dev/null 2>&1
> > > -        schroot_delete_configs
> > > -    }
> > > -    trap 'exit 1' INT HUP QUIT TERM ALRM USR1
> > > -    trap 'schroot_cleanup' EXIT
> > > -
> > > -    schroot -r -c ${session_id} -d / -u root -- \
> > > -        rm /etc/apt/sources.list.d/isar-apt.list
> > > /etc/apt/preferences.d/isar-apt
> > > -    schroot -r -c ${session_id} -d / -- \
> > > -        sh -c '
> > > -            set -e
> > > -            for uri in $2; do
> > > -                dscfile="$(apt-get -y -qq --print-uris --only-
> > > source
> > > source $uri | cut -d " " -f2 | grep -E "*.dsc")"
> > > -                cd ${PP}
> > > -                cp /downloads/deb-src/"${1}"/${uri}/* ${PP}
> > > -                dpkg-source -x "${dscfile}" "${PPS}"
> > > -            done' \
> > > -                my_script "${BASE_DISTRO}-
> > > ${BASE_DISTRO_CODENAME}"
> > > "${SRC_APT}"
> > > -
> > > -    schroot -e -c ${session_id}
> > > -    schroot_delete_configs
> > > -}
> > > -do_apt_unpack[network] = "${TASK_USE_SUDO}"
> > > -
> > > -addtask apt_unpack after do_apt_fetch
> > > -
> > > -do_cleanall_apt[nostamp] = "1"
> > > -do_cleanall_apt() {
> > > -    for uri in "${SRC_APT}"; do
> > > -        rm -rf "${DEBSRCDIR}/${BASE_DISTRO}-
> > > ${BASE_DISTRO_CODENAME}/$uri"
> > > -    done
> > > -}
> > > -
> > >  def get_package_srcdir(d):
> > >      s = os.path.abspath(d.getVar("S"))
> > >      workdir = os.path.abspath(d.getVar("WORKDIR"))
> > > diff --git a/meta/lib/aptsrc_fetcher.py
> > > b/meta/lib/aptsrc_fetcher.py
> > > new file mode 100644
> > > index 00000000..ee726202
> > > --- /dev/null
> > > +++ b/meta/lib/aptsrc_fetcher.py
> > > @@ -0,0 +1,93 @@
> > > +# This software is a part of ISAR.
> > > +# Copyright (c) Siemens AG, 2024
> > > +#
> > > +# SPDX-License-Identifier: MIT
> > > +
> > > +from bb.fetch2 import FetchError
> > > +from bb.fetch2 import FetchMethod
> > > +from bb.fetch2 import logger
> > > +from bb.fetch2 import runfetchcmd
> > > +
> > > +class AptSrc(FetchMethod):
> > > +    def supports(self, ud, d):
> > > +        return ud.type in ['apt']
> > > +
> > > +    def urldata_init(self, ud, d):
> > > +        ud.src_package = ud.url[len('apt://'):]
> > > +        ud.host = ud.host.replace('=', '_')
> > > +
> > > +        base_distro = d.getVar('BASE_DISTRO')
> > > +
> > > +        # For these distros we know that the same version means
> > > the
> > > same
> > > +        # source package, also across distro releases.
> > > +        distro_suffix = '' if base_distro in ['debian',
> > > 'ubuntu']
> > > else \
> > > +            '-' + d.getVar('BASE_DISTRO_CODENAME')
> > > +
> > > +        ud.localfile='deb-src/' + base_distro + distro_suffix +
> > > '/'
> > > + ud.host
> > > +
> > > +    def download(self, ud, d):
> > > +        bb.utils.exec_flat_python_func('isar_export_proxies', d)
> > > +        bb.build.exec_func('schroot_create_configs', d)
> > > +
> > > +        sbuild_chroot = d.getVar('SBUILD_CHROOT')
> > > +        session_id = runfetchcmd(f'schroot -q -b -c
> > > {sbuild_chroot}', d).strip()
> > > +        logger.info(f'Started session: {session_id}')
> > > +
> > > +        repo_isar_dir = d.getVar('REPO_ISAR_DIR')
> > > +        lockfile =
> > > bb.utils.lockfile(f'{repo_isar_dir}/isar.lock')
> > > +
> > > +        try:
> > > +            runfetchcmd(f'''
> > > +                set -e
> > > +                schroot -r -c {session_id} -d / -u root -- \
> > > +                    rm /etc/apt/sources.list.d/isar-apt.list
> > > /etc/apt/preferences.d/isar-apt
> > > +                schroot -r -c {session_id} -d / -- \
> > > +                    sh -c '
> > > +                        set -e
> > > +                        mkdir -p /downloads/{ud.localfile}
> > > +                        cd /downloads/{ud.localfile}
> > > +                        apt-get -y --download-only --only-source
> > > source {ud.src_package}
> > > +                        '
> > > +                ''', d)
> > > +        except (OSError, FetchError):
> > > +            raise
> > > +        finally:
> > > +            bb.utils.unlockfile(lockfile)
> > > +            runfetchcmd(f'schroot -q -f -e -c {session_id}', d)
> > > +            bb.build.exec_func('schroot_delete_configs', d)
> > > +
> > > +    def unpack(self, ud, rootdir, d):
> > > +        bb.build.exec_func('schroot_create_configs', d)
> > > +
> > > +        sbuild_chroot = d.getVar('SBUILD_CHROOT')
> > > +        session_id = runfetchcmd(f'schroot -q -b -c
> > > {sbuild_chroot}', d).strip()
> > > +        logger.info(f'Started session: {session_id}')
> > > +
> > > +        pp = d.getVar('PP')
> > > +        pps = d.getVar('PPS')
> > > +        try:
> > > +            runfetchcmd(f'''
> > > +                set -e
> > > +                schroot -r -c {session_id} -d / -u root -- \
> > > +                    rm /etc/apt/sources.list.d/isar-apt.list
> > > /etc/apt/preferences.d/isar-apt
> > > +                schroot -r -c {session_id} -d / -- \
> > > +                    sh -c '
> > > +                        set -e
> > > +                        dscfile=$(apt-get -y -qq --print-uris --
> > > only-source source {ud.src_package} | \
> > > +                                  cut -d " " -f2 | grep -E
> > > "\.dsc")
> > > +                        cp /downloads/{ud.localfile}/* {pp}
> > > +                        cd {pp}
> > > +                        mv -f {pps} {pps}.prev
> > > +                        dpkg-source -x "$dscfile" {pps}
> > 
> > Hello.
> > 
> > This still fails in CI, but this time I had some time to find the
> > root
> > cause.
> > 
> > The problem is that buster(bullseye) and bookworm(trixie) provide
> > different versions of "hello" package.
> > 
> > If we first build e.g. `mc:qemuamd64-bookworm:hello`, hello_2.10-
> > 3.dsc
> > is downloaded and the whole "downloads/deb-src/debian/hello/" is
> > considered finished with "downloads/deb-src/debian/hello.done"
> > flag. 
> > 
> > So, when e.g. `mc:qemuamd64-bullseye:hello` build follows, it
> > doesn't
> > download hello_2.10-2.dsc an results in dpkg-source error.
> > 
> > It doesn't matter if we build both targets in parallel or
> > sequentially,
> > the latest always fails.
> > 
> 
> Thanks for the analysis. I'll check if I can reproduce und understand
> to
> root cause.
> 
> Jan
> 

The easy way to reproduce:

./kas/kas-container menu # select e.g. qemuamd64-bookworm, save & exit
./kas/kas-container shell -c 'bitbake hello'
./kas/kas-container menu # select e.g. qemuamd64-bullseye, save & exit
./kas/kas-container shell -c 'bitbake hello'
Uladzimir Bely Nov. 28, 2024, 6:23 a.m. UTC | #4
On Thu, 2024-11-28 at 09:03 +0300, Uladzimir Bely wrote:
> On Thu, 2024-11-28 at 12:55 +0800, Jan Kiszka wrote:
> > On 27.11.24 22:07, Uladzimir Bely wrote:
> > > On Fri, 2024-11-15 at 17:40 +0100, Jan Kiszka wrote:
> > > > From: Jan Kiszka <jan.kiszka@siemens.com>
> > > > 
> > > > There is no major functional difference, but we no longer have
> > > > to
> > > > manipulate SRC_URI by registering an official fetcher for
> > > > apt://.
> > > > 
> > > > As the fetching no longer takes place in separate tasks,
> > > > do_fetch
> > > > and
> > > > do_unpack need to gain the extra flags that were so far
> > > > assigned
> > > > to
> > > > apt_fetch and apt_unpack. That happens conditionally, i.e. only
> > > > if
> > > > SRC_URI actually contains an apt URL.
> > > > 
> > > > One difference to the original version is the possibility -
> > > > even
> > > > if
> > > > practically of minor relevance - to unpack multiple apt sources
> > > > into
> > > > S.
> > > > The old version contained a loop but was directing dpkg-source
> > > > to
> > > > a
> > > > pre-existing dir which would have failed on the second
> > > > iteration.
> > > > The
> > > > new version now folds the results together after each step.
> > > > 
> > > > Another minor difference is that unversioned fetches put their
> > > > results
> > > > into the same subfolder in DL_DIR, also when specifying a
> > > > distro
> > > > codename. Only versioned fetches get dedicated folders (and
> > > > .done
> > > > stamps).
> > > > 
> > > > There is no progress report realized because dpkg-source
> > > > unfortunately
> > > > does not provide information upfront to make this predictable,
> > > > thus
> > > > expressible in form of percentage.
> > > > 
> > > > Signed-off-by: Jan Kiszka <jan.kiszka@siemens.com>
> > > > ---
> > > > 
> > > > Changes in v2:
> > > >  - rebased, including the removal of isar-apt sources in
> > > > apt_unpack
> > > > 
> > > > I'm carefully optimistic that this change also resolves the
> > > > previously 
> > > > seen issue in CI.
> > > > 
> > > >  meta/classes/dpkg-base.bbclass | 104 ++++---------------------
> > > > --
> > > > ----
> > > > --
> > > >  meta/lib/aptsrc_fetcher.py     |  93
> > > > +++++++++++++++++++++++++++++
> > > >  2 files changed, 104 insertions(+), 93 deletions(-)
> > > >  create mode 100644 meta/lib/aptsrc_fetcher.py
> > > > 
> > > > diff --git a/meta/classes/dpkg-base.bbclass
> > > > b/meta/classes/dpkg-
> > > > base.bbclass
> > > > index b4ea8e17..c02c07a8 100644
> > > > --- a/meta/classes/dpkg-base.bbclass
> > > > +++ b/meta/classes/dpkg-base.bbclass
> > > > @@ -79,110 +79,28 @@ do_adjust_git[lockfiles] +=
> > > > "${DL_DIR}/git/isar.lock"
> > > >  inherit patch
> > > >  addtask patch after do_adjust_git
> > > >  
> > > > -SRC_APT ?= ""
> > > > -
> > > > -# filter out all "apt://" URIs out of SRC_URI and stick them
> > > > into
> > > > SRC_APT
> > > >  python() {
> > > > -    src_uri = (d.getVar('SRC_URI', False) or "").split()
> > > > +    from bb.fetch2 import methods
> > > >  
> > > > -    prefix = "apt://"
> > > > -    src_apt = []
> > > > -    for u in src_uri:
> > > > -        if u.startswith(prefix):
> > > > -            src_apt.append(u[len(prefix) :])
> > > > -            d.setVar('SRC_URI:remove', u)
> > > > +    # apt-src fetcher
> > > > +    import aptsrc_fetcher
> > > > +    methods.append(aptsrc_fetcher.AptSrc())
> > > >  
> > > > -    d.prependVar('SRC_APT', ' '.join(src_apt))
> > > > +    src_uri = (d.getVar('SRC_URI', False) or "").split()
> > > > +    for u in src_uri:
> > > > +        if u.startswith("apt://"):
> > > > +            d.appendVarFlag('do_fetch', 'depends',
> > > > d.getVar('SCHROOT_DEP'))
> > > >  
> > > > -    if len(d.getVar('SRC_APT').strip()) > 0:
> > > > -        bb.build.addtask('apt_unpack', 'do_patch', '', d)
> > > > -        bb.build.addtask('cleanall_apt', 'do_cleanall', '', d)
> > > > +            d.appendVarFlag('do_unpack', 'cleandirs',
> > > > d.getVar('S'))
> > > > +            d.setVarFlag('do_unpack', 'network',
> > > > d.getVar('TASK_USE_SUDO'))
> > > > +            break
> > > >  
> > > >      # container docker fetcher
> > > >      import container_fetcher
> > > > -    from bb.fetch2 import methods
> > > >  
> > > >      methods.append(container_fetcher.Container())
> > > >  }
> > > >  
> > > > -do_apt_fetch() {
> > > > -    E="${@ isar_export_proxies(d)}"
> > > > -    schroot_create_configs
> > > > -
> > > > -    session_id=$(schroot -q -b -c ${SBUILD_CHROOT})
> > > > -    echo "Started session: ${session_id}"
> > > > -
> > > > -    schroot_cleanup() {
> > > > -        schroot -q -f -e -c ${session_id} > /dev/null 2>&1
> > > > -        schroot_delete_configs
> > > > -    }
> > > > -    trap 'exit 1' INT HUP QUIT TERM ALRM USR1
> > > > -    trap 'schroot_cleanup' EXIT
> > > > -
> > > > -    schroot -r -c ${session_id} -d / -u root -- \
> > > > -        rm /etc/apt/sources.list.d/isar-apt.list
> > > > /etc/apt/preferences.d/isar-apt
> > > > -    schroot -r -c ${session_id} -d / -- \
> > > > -        sh -c '
> > > > -            set -e
> > > > -            for uri in $2; do
> > > > -                mkdir -p /downloads/deb-src/"$1"/${uri}
> > > > -                cd /downloads/deb-src/"$1"/${uri}
> > > > -                apt-get -y --download-only --only-source
> > > > source
> > > > ${uri}
> > > > -            done' \
> > > > -                my_script "${BASE_DISTRO}-
> > > > ${BASE_DISTRO_CODENAME}"
> > > > "${SRC_APT}"
> > > > -
> > > > -    schroot -e -c ${session_id}
> > > > -    schroot_delete_configs
> > > > -}
> > > > -
> > > > -addtask apt_fetch
> > > > -do_apt_fetch[lockfiles] += "${REPO_ISAR_DIR}/isar.lock"
> > > > -do_apt_fetch[network] = "${TASK_USE_NETWORK_AND_SUDO}"
> > > > -
> > > > -# Add dependency from the correct schroot: host or target
> > > > -do_apt_fetch[depends] += "${SCHROOT_DEP}"
> > > > -
> > > > -do_apt_unpack() {
> > > > -    rm -rf ${S}
> > > > -    schroot_create_configs
> > > > -
> > > > -    session_id=$(schroot -q -b -c ${SBUILD_CHROOT})
> > > > -    echo "Started session: ${session_id}"
> > > > -
> > > > -    schroot_cleanup() {
> > > > -        schroot -q -f -e -c ${session_id} > /dev/null 2>&1
> > > > -        schroot_delete_configs
> > > > -    }
> > > > -    trap 'exit 1' INT HUP QUIT TERM ALRM USR1
> > > > -    trap 'schroot_cleanup' EXIT
> > > > -
> > > > -    schroot -r -c ${session_id} -d / -u root -- \
> > > > -        rm /etc/apt/sources.list.d/isar-apt.list
> > > > /etc/apt/preferences.d/isar-apt
> > > > -    schroot -r -c ${session_id} -d / -- \
> > > > -        sh -c '
> > > > -            set -e
> > > > -            for uri in $2; do
> > > > -                dscfile="$(apt-get -y -qq --print-uris --only-
> > > > source
> > > > source $uri | cut -d " " -f2 | grep -E "*.dsc")"
> > > > -                cd ${PP}
> > > > -                cp /downloads/deb-src/"${1}"/${uri}/* ${PP}
> > > > -                dpkg-source -x "${dscfile}" "${PPS}"
> > > > -            done' \
> > > > -                my_script "${BASE_DISTRO}-
> > > > ${BASE_DISTRO_CODENAME}"
> > > > "${SRC_APT}"
> > > > -
> > > > -    schroot -e -c ${session_id}
> > > > -    schroot_delete_configs
> > > > -}
> > > > -do_apt_unpack[network] = "${TASK_USE_SUDO}"
> > > > -
> > > > -addtask apt_unpack after do_apt_fetch
> > > > -
> > > > -do_cleanall_apt[nostamp] = "1"
> > > > -do_cleanall_apt() {
> > > > -    for uri in "${SRC_APT}"; do
> > > > -        rm -rf "${DEBSRCDIR}/${BASE_DISTRO}-
> > > > ${BASE_DISTRO_CODENAME}/$uri"
> > > > -    done
> > > > -}
> > > > -
> > > >  def get_package_srcdir(d):
> > > >      s = os.path.abspath(d.getVar("S"))
> > > >      workdir = os.path.abspath(d.getVar("WORKDIR"))
> > > > diff --git a/meta/lib/aptsrc_fetcher.py
> > > > b/meta/lib/aptsrc_fetcher.py
> > > > new file mode 100644
> > > > index 00000000..ee726202
> > > > --- /dev/null
> > > > +++ b/meta/lib/aptsrc_fetcher.py
> > > > @@ -0,0 +1,93 @@
> > > > +# This software is a part of ISAR.
> > > > +# Copyright (c) Siemens AG, 2024
> > > > +#
> > > > +# SPDX-License-Identifier: MIT
> > > > +
> > > > +from bb.fetch2 import FetchError
> > > > +from bb.fetch2 import FetchMethod
> > > > +from bb.fetch2 import logger
> > > > +from bb.fetch2 import runfetchcmd
> > > > +
> > > > +class AptSrc(FetchMethod):
> > > > +    def supports(self, ud, d):
> > > > +        return ud.type in ['apt']
> > > > +
> > > > +    def urldata_init(self, ud, d):
> > > > +        ud.src_package = ud.url[len('apt://'):]
> > > > +        ud.host = ud.host.replace('=', '_')
> > > > +
> > > > +        base_distro = d.getVar('BASE_DISTRO')
> > > > +
> > > > +        # For these distros we know that the same version
> > > > means
> > > > the
> > > > same
> > > > +        # source package, also across distro releases.
> > > > +        distro_suffix = '' if base_distro in ['debian',
> > > > 'ubuntu']
> > > > else \
> > > > +            '-' + d.getVar('BASE_DISTRO_CODENAME')

I think, to avoid the issue I mentioned, we should continue using
${BASE_DISTRO}-${BASE_DISTRO_CODENAME} here without exceptions.
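
Something like this minimal sketch, based on the v2 urldata_init quoted
above, with the debian/ubuntu exception dropped:

    def urldata_init(self, ud, d):
        ud.src_package = ud.url[len('apt://'):]
        ud.host = ud.host.replace('=', '_')

        base_distro = d.getVar('BASE_DISTRO')
        codename = d.getVar('BASE_DISTRO_CODENAME')

        # Always separate downloads (and .done stamps) per distro release,
        # since releases may ship different versions of the same source
        # package behind an unversioned apt:// URL.
        ud.localfile = 'deb-src/' + base_distro + '-' + codename + '/' + ud.host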

Also, the cache_deb_src() function in rootfs.bbclass still uses this
location.

> > > > +
> > > > +        ud.localfile='deb-src/' + base_distro + distro_suffix
> > > > +
> > > > '/'
> > > > + ud.host
> > > > +
> > > > +    def download(self, ud, d):
> > > > +        bb.utils.exec_flat_python_func('isar_export_proxies',
> > > > d)
> > > > +        bb.build.exec_func('schroot_create_configs', d)
> > > > +
> > > > +        sbuild_chroot = d.getVar('SBUILD_CHROOT')
> > > > +        session_id = runfetchcmd(f'schroot -q -b -c
> > > > {sbuild_chroot}', d).strip()
> > > > +        logger.info(f'Started session: {session_id}')
> > > > +
> > > > +        repo_isar_dir = d.getVar('REPO_ISAR_DIR')
> > > > +        lockfile =
> > > > bb.utils.lockfile(f'{repo_isar_dir}/isar.lock')
> > > > +
> > > > +        try:
> > > > +            runfetchcmd(f'''
> > > > +                set -e
> > > > +                schroot -r -c {session_id} -d / -u root -- \
> > > > +                    rm /etc/apt/sources.list.d/isar-apt.list
> > > > /etc/apt/preferences.d/isar-apt
> > > > +                schroot -r -c {session_id} -d / -- \
> > > > +                    sh -c '
> > > > +                        set -e
> > > > +                        mkdir -p /downloads/{ud.localfile}
> > > > +                        cd /downloads/{ud.localfile}
> > > > +                        apt-get -y --download-only --only-
> > > > source
> > > > source {ud.src_package}
> > > > +                        '
> > > > +                ''', d)
> > > > +        except (OSError, FetchError):
> > > > +            raise
> > > > +        finally:
> > > > +            bb.utils.unlockfile(lockfile)
> > > > +            runfetchcmd(f'schroot -q -f -e -c {session_id}',
> > > > d)
> > > > +            bb.build.exec_func('schroot_delete_configs', d)
> > > > +
> > > > +    def unpack(self, ud, rootdir, d):
> > > > +        bb.build.exec_func('schroot_create_configs', d)
> > > > +
> > > > +        sbuild_chroot = d.getVar('SBUILD_CHROOT')
> > > > +        session_id = runfetchcmd(f'schroot -q -b -c
> > > > {sbuild_chroot}', d).strip()
> > > > +        logger.info(f'Started session: {session_id}')
> > > > +
> > > > +        pp = d.getVar('PP')
> > > > +        pps = d.getVar('PPS')
> > > > +        try:
> > > > +            runfetchcmd(f'''
> > > > +                set -e
> > > > +                schroot -r -c {session_id} -d / -u root -- \
> > > > +                    rm /etc/apt/sources.list.d/isar-apt.list
> > > > /etc/apt/preferences.d/isar-apt
> > > > +                schroot -r -c {session_id} -d / -- \
> > > > +                    sh -c '
> > > > +                        set -e
> > > > +                        dscfile=$(apt-get -y -qq --print-uris
> > > > --
> > > > only-source source {ud.src_package} | \
> > > > +                                  cut -d " " -f2 | grep -E
> > > > "\.dsc")
> > > > +                        cp /downloads/{ud.localfile}/* {pp}
> > > > +                        cd {pp}
> > > > +                        mv -f {pps} {pps}.prev
> > > > +                        dpkg-source -x "$dscfile" {pps}
> > > 
> > > Hello.
> > > 
> > > This still fails in CI, but this time I had some time to find the
> > > root
> > > cause.
> > > 
> > > The problem is that buster(bullseye) and bookworm(trixie) provide
> > > different versions of "hello" package.
> > > 
> > > If we first build e.g. `mc:qemuamd64-bookworm:hello`, hello_2.10-
> > > 3.dsc
> > > is downloaded and the whole "downloads/deb-src/debian/hello/" is
> > > considered finished with "downloads/deb-src/debian/hello.done"
> > > flag. 
> > > 
> > > So, when e.g. `mc:qemuamd64-bullseye:hello` build follows, it
> > > doesn't
> > > download hello_2.10-2.dsc an results in dpkg-source error.
> > > 
> > > It doesn't matter if we build both targets in parallel or
> > > sequentially,
> > > the latest always fails.
> > > 
> > 
> > Thanks for the analysis. I'll check if I can reproduce und
> > understand
> > to
> > root cause.
> > 
> > Jan
> > 
> 
> The easy way to reproduce:
> 
> ./kas/kas-container menu # select e.g. qemuamd64-bookworm, save &
> exit
> ./kas/kas-container shell -c 'bitbake hello'
> ./kas/kas-container menu # select e.g. qemuamd64-bullseye, save &
> exit
> ./kas/kas-container shell -c 'bitbake hello'
> 
> -- 
> Best regards,
> Uladzimir.
>
Jan Kiszka Nov. 29, 2024, 11:42 a.m. UTC | #5
On 28.11.24 14:23, Uladzimir Bely wrote:
> On Thu, 2024-11-28 at 09:03 +0300, Uladzimir Bely wrote:
>> On Thu, 2024-11-28 at 12:55 +0800, Jan Kiszka wrote:
>>> On 27.11.24 22:07, Uladzimir Bely wrote:
>>>> On Fri, 2024-11-15 at 17:40 +0100, Jan Kiszka wrote:
>>>>> From: Jan Kiszka <jan.kiszka@siemens.com>
>>>>>
>>>>> There is no major functional difference, but we no longer have
>>>>> to
>>>>> manipulate SRC_URI by registering an official fetcher for
>>>>> apt://.
>>>>>
>>>>> As the fetching no longer takes place in separate tasks,
>>>>> do_fetch
>>>>> and
>>>>> do_unpack need to gain the extra flags that were so far
>>>>> assigned
>>>>> to
>>>>> apt_fetch and apt_unpack. That happens conditionally, i.e. only
>>>>> if
>>>>> SRC_URI actually contains an apt URL.
>>>>>
>>>>> One difference to the original version is the possibility -
>>>>> even
>>>>> if
>>>>> practically of minor relevance - to unpack multiple apt sources
>>>>> into
>>>>> S.
>>>>> The old version contained a loop but was directing dpkg-source
>>>>> to
>>>>> a
>>>>> pre-existing dir which would have failed on the second
>>>>> iteration.
>>>>> The
>>>>> new version now folds the results together after each step.
>>>>>
>>>>> Another minor difference is that unversioned fetches put their
>>>>> results
>>>>> into the same subfolder in DL_DIR, also when specifying a
>>>>> distro
>>>>> codename. Only versioned fetches get dedicated folders (and
>>>>> .done
>>>>> stamps).
>>>>>
>>>>> There is no progress report realized because dpkg-source
>>>>> unfortunately
>>>>> does not provide information upfront to make this predictable,
>>>>> thus
>>>>> expressible in form of percentage.
>>>>>
>>>>> Signed-off-by: Jan Kiszka <jan.kiszka@siemens.com>
>>>>> ---
>>>>>
>>>>> Changes in v2:
>>>>>  - rebased, including the removal of isar-apt sources in
>>>>> apt_unpack
>>>>>
>>>>> I'm carefully optimistic that this change also resolves the
>>>>> previously 
>>>>> seen issue in CI.
>>>>>
>>>>>  meta/classes/dpkg-base.bbclass | 104 ++++---------------------
>>>>> --
>>>>> ----
>>>>> --
>>>>>  meta/lib/aptsrc_fetcher.py     |  93
>>>>> +++++++++++++++++++++++++++++
>>>>>  2 files changed, 104 insertions(+), 93 deletions(-)
>>>>>  create mode 100644 meta/lib/aptsrc_fetcher.py
>>>>>
>>>>> diff --git a/meta/classes/dpkg-base.bbclass
>>>>> b/meta/classes/dpkg-
>>>>> base.bbclass
>>>>> index b4ea8e17..c02c07a8 100644
>>>>> --- a/meta/classes/dpkg-base.bbclass
>>>>> +++ b/meta/classes/dpkg-base.bbclass
>>>>> @@ -79,110 +79,28 @@ do_adjust_git[lockfiles] +=
>>>>> "${DL_DIR}/git/isar.lock"
>>>>>  inherit patch
>>>>>  addtask patch after do_adjust_git
>>>>>  
>>>>> -SRC_APT ?= ""
>>>>> -
>>>>> -# filter out all "apt://" URIs out of SRC_URI and stick them
>>>>> into
>>>>> SRC_APT
>>>>>  python() {
>>>>> -    src_uri = (d.getVar('SRC_URI', False) or "").split()
>>>>> +    from bb.fetch2 import methods
>>>>>  
>>>>> -    prefix = "apt://"
>>>>> -    src_apt = []
>>>>> -    for u in src_uri:
>>>>> -        if u.startswith(prefix):
>>>>> -            src_apt.append(u[len(prefix) :])
>>>>> -            d.setVar('SRC_URI:remove', u)
>>>>> +    # apt-src fetcher
>>>>> +    import aptsrc_fetcher
>>>>> +    methods.append(aptsrc_fetcher.AptSrc())
>>>>>  
>>>>> -    d.prependVar('SRC_APT', ' '.join(src_apt))
>>>>> +    src_uri = (d.getVar('SRC_URI', False) or "").split()
>>>>> +    for u in src_uri:
>>>>> +        if u.startswith("apt://"):
>>>>> +            d.appendVarFlag('do_fetch', 'depends',
>>>>> d.getVar('SCHROOT_DEP'))
>>>>>  
>>>>> -    if len(d.getVar('SRC_APT').strip()) > 0:
>>>>> -        bb.build.addtask('apt_unpack', 'do_patch', '', d)
>>>>> -        bb.build.addtask('cleanall_apt', 'do_cleanall', '', d)
>>>>> +            d.appendVarFlag('do_unpack', 'cleandirs',
>>>>> d.getVar('S'))
>>>>> +            d.setVarFlag('do_unpack', 'network',
>>>>> d.getVar('TASK_USE_SUDO'))
>>>>> +            break
>>>>>  
>>>>>      # container docker fetcher
>>>>>      import container_fetcher
>>>>> -    from bb.fetch2 import methods
>>>>>  
>>>>>      methods.append(container_fetcher.Container())
>>>>>  }
>>>>>  
>>>>> -do_apt_fetch() {
>>>>> -    E="${@ isar_export_proxies(d)}"
>>>>> -    schroot_create_configs
>>>>> -
>>>>> -    session_id=$(schroot -q -b -c ${SBUILD_CHROOT})
>>>>> -    echo "Started session: ${session_id}"
>>>>> -
>>>>> -    schroot_cleanup() {
>>>>> -        schroot -q -f -e -c ${session_id} > /dev/null 2>&1
>>>>> -        schroot_delete_configs
>>>>> -    }
>>>>> -    trap 'exit 1' INT HUP QUIT TERM ALRM USR1
>>>>> -    trap 'schroot_cleanup' EXIT
>>>>> -
>>>>> -    schroot -r -c ${session_id} -d / -u root -- \
>>>>> -        rm /etc/apt/sources.list.d/isar-apt.list
>>>>> /etc/apt/preferences.d/isar-apt
>>>>> -    schroot -r -c ${session_id} -d / -- \
>>>>> -        sh -c '
>>>>> -            set -e
>>>>> -            for uri in $2; do
>>>>> -                mkdir -p /downloads/deb-src/"$1"/${uri}
>>>>> -                cd /downloads/deb-src/"$1"/${uri}
>>>>> -                apt-get -y --download-only --only-source
>>>>> source
>>>>> ${uri}
>>>>> -            done' \
>>>>> -                my_script "${BASE_DISTRO}-
>>>>> ${BASE_DISTRO_CODENAME}"
>>>>> "${SRC_APT}"
>>>>> -
>>>>> -    schroot -e -c ${session_id}
>>>>> -    schroot_delete_configs
>>>>> -}
>>>>> -
>>>>> -addtask apt_fetch
>>>>> -do_apt_fetch[lockfiles] += "${REPO_ISAR_DIR}/isar.lock"
>>>>> -do_apt_fetch[network] = "${TASK_USE_NETWORK_AND_SUDO}"
>>>>> -
>>>>> -# Add dependency from the correct schroot: host or target
>>>>> -do_apt_fetch[depends] += "${SCHROOT_DEP}"
>>>>> -
>>>>> -do_apt_unpack() {
>>>>> -    rm -rf ${S}
>>>>> -    schroot_create_configs
>>>>> -
>>>>> -    session_id=$(schroot -q -b -c ${SBUILD_CHROOT})
>>>>> -    echo "Started session: ${session_id}"
>>>>> -
>>>>> -    schroot_cleanup() {
>>>>> -        schroot -q -f -e -c ${session_id} > /dev/null 2>&1
>>>>> -        schroot_delete_configs
>>>>> -    }
>>>>> -    trap 'exit 1' INT HUP QUIT TERM ALRM USR1
>>>>> -    trap 'schroot_cleanup' EXIT
>>>>> -
>>>>> -    schroot -r -c ${session_id} -d / -u root -- \
>>>>> -        rm /etc/apt/sources.list.d/isar-apt.list
>>>>> /etc/apt/preferences.d/isar-apt
>>>>> -    schroot -r -c ${session_id} -d / -- \
>>>>> -        sh -c '
>>>>> -            set -e
>>>>> -            for uri in $2; do
>>>>> -                dscfile="$(apt-get -y -qq --print-uris --only-
>>>>> source
>>>>> source $uri | cut -d " " -f2 | grep -E "*.dsc")"
>>>>> -                cd ${PP}
>>>>> -                cp /downloads/deb-src/"${1}"/${uri}/* ${PP}
>>>>> -                dpkg-source -x "${dscfile}" "${PPS}"
>>>>> -            done' \
>>>>> -                my_script "${BASE_DISTRO}-
>>>>> ${BASE_DISTRO_CODENAME}"
>>>>> "${SRC_APT}"
>>>>> -
>>>>> -    schroot -e -c ${session_id}
>>>>> -    schroot_delete_configs
>>>>> -}
>>>>> -do_apt_unpack[network] = "${TASK_USE_SUDO}"
>>>>> -
>>>>> -addtask apt_unpack after do_apt_fetch
>>>>> -
>>>>> -do_cleanall_apt[nostamp] = "1"
>>>>> -do_cleanall_apt() {
>>>>> -    for uri in "${SRC_APT}"; do
>>>>> -        rm -rf "${DEBSRCDIR}/${BASE_DISTRO}-
>>>>> ${BASE_DISTRO_CODENAME}/$uri"
>>>>> -    done
>>>>> -}
>>>>> -
>>>>>  def get_package_srcdir(d):
>>>>>      s = os.path.abspath(d.getVar("S"))
>>>>>      workdir = os.path.abspath(d.getVar("WORKDIR"))
>>>>> diff --git a/meta/lib/aptsrc_fetcher.py b/meta/lib/aptsrc_fetcher.py
>>>>> new file mode 100644
>>>>> index 00000000..ee726202
>>>>> --- /dev/null
>>>>> +++ b/meta/lib/aptsrc_fetcher.py
>>>>> @@ -0,0 +1,93 @@
>>>>> +# This software is a part of ISAR.
>>>>> +# Copyright (c) Siemens AG, 2024
>>>>> +#
>>>>> +# SPDX-License-Identifier: MIT
>>>>> +
>>>>> +from bb.fetch2 import FetchError
>>>>> +from bb.fetch2 import FetchMethod
>>>>> +from bb.fetch2 import logger
>>>>> +from bb.fetch2 import runfetchcmd
>>>>> +
>>>>> +class AptSrc(FetchMethod):
>>>>> +    def supports(self, ud, d):
>>>>> +        return ud.type in ['apt']
>>>>> +
>>>>> +    def urldata_init(self, ud, d):
>>>>> +        ud.src_package = ud.url[len('apt://'):]
>>>>> +        ud.host = ud.host.replace('=', '_')
>>>>> +
>>>>> +        base_distro = d.getVar('BASE_DISTRO')
>>>>> +
>>>>> +        # For these distros we know that the same version means the same
>>>>> +        # source package, also across distro releases.
>>>>> +        distro_suffix = '' if base_distro in ['debian', 'ubuntu'] else \
>>>>> +            '-' + d.getVar('BASE_DISTRO_CODENAME')
> 
> I think, to avoid the issue I mentioned, we should continue using
> ${BASE_DISTRO}-${BASE_DISTRO_CODENAME} here without exceptions.
> 

Yeah, the main issue is that we cannot predict the actual version that
will be requested, so we end up with a single, unversioned .done file
for the fetching tasks. We would have to retrieve the version on every
run in order to decide whether to re-run the fetch - not simple and
likely not worth it. I will change back to the existing path.
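
Roughly like this in urldata_init() - an untested sketch, just to
illustrate dropping the exception again (names as in the posted
fetcher):

    def urldata_init(self, ud, d):
        ud.src_package = ud.url[len('apt://'):]
        ud.host = ud.host.replace('=', '_')

        # always key the download location on distro *and* codename so
        # that switching the codename (e.g. bookworm -> bullseye) cannot
        # pick up a stale, unversioned download
        base_distro = d.getVar('BASE_DISTRO')
        codename = d.getVar('BASE_DISTRO_CODENAME')
        ud.localfile = 'deb-src/' + base_distro + '-' + codename + '/' + ud.host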

> Also, the cache_deb_src() function in rootfs.bbclass still uses this
> location.

Right, that would be easier to fix, though.

Thanks,
Jan
Jan Kiszka Nov. 29, 2024, 11:53 a.m. UTC | #6
On 28.11.24 14:03, Uladzimir Bely wrote:
> The easy way to reproduce:
> 
> ./kas/kas-container menu # select e.g. qemuamd64-bookworm, save & exit
> ./kas/kas-container shell -c 'bitbake hello'

JFYI, this is generally more handy:

./kas/kas-container build --target hello

Jan

> ./kas/kas-container menu # select e.g. qemuamd64-bullseye, save & exit
> ./kas/kas-container shell -c 'bitbake hello'
>

Patch

diff --git a/meta/classes/dpkg-base.bbclass b/meta/classes/dpkg-base.bbclass
index b4ea8e17..c02c07a8 100644
--- a/meta/classes/dpkg-base.bbclass
+++ b/meta/classes/dpkg-base.bbclass
@@ -79,110 +79,28 @@  do_adjust_git[lockfiles] += "${DL_DIR}/git/isar.lock"
 inherit patch
 addtask patch after do_adjust_git
 
-SRC_APT ?= ""
-
-# filter out all "apt://" URIs out of SRC_URI and stick them into SRC_APT
 python() {
-    src_uri = (d.getVar('SRC_URI', False) or "").split()
+    from bb.fetch2 import methods
 
-    prefix = "apt://"
-    src_apt = []
-    for u in src_uri:
-        if u.startswith(prefix):
-            src_apt.append(u[len(prefix) :])
-            d.setVar('SRC_URI:remove', u)
+    # apt-src fetcher
+    import aptsrc_fetcher
+    methods.append(aptsrc_fetcher.AptSrc())
 
-    d.prependVar('SRC_APT', ' '.join(src_apt))
+    src_uri = (d.getVar('SRC_URI', False) or "").split()
+    for u in src_uri:
+        if u.startswith("apt://"):
+            d.appendVarFlag('do_fetch', 'depends', d.getVar('SCHROOT_DEP'))
 
-    if len(d.getVar('SRC_APT').strip()) > 0:
-        bb.build.addtask('apt_unpack', 'do_patch', '', d)
-        bb.build.addtask('cleanall_apt', 'do_cleanall', '', d)
+            d.appendVarFlag('do_unpack', 'cleandirs', d.getVar('S'))
+            d.setVarFlag('do_unpack', 'network', d.getVar('TASK_USE_SUDO'))
+            break
 
     # container docker fetcher
     import container_fetcher
-    from bb.fetch2 import methods
 
     methods.append(container_fetcher.Container())
 }
 
-do_apt_fetch() {
-    E="${@ isar_export_proxies(d)}"
-    schroot_create_configs
-
-    session_id=$(schroot -q -b -c ${SBUILD_CHROOT})
-    echo "Started session: ${session_id}"
-
-    schroot_cleanup() {
-        schroot -q -f -e -c ${session_id} > /dev/null 2>&1
-        schroot_delete_configs
-    }
-    trap 'exit 1' INT HUP QUIT TERM ALRM USR1
-    trap 'schroot_cleanup' EXIT
-
-    schroot -r -c ${session_id} -d / -u root -- \
-        rm /etc/apt/sources.list.d/isar-apt.list /etc/apt/preferences.d/isar-apt
-    schroot -r -c ${session_id} -d / -- \
-        sh -c '
-            set -e
-            for uri in $2; do
-                mkdir -p /downloads/deb-src/"$1"/${uri}
-                cd /downloads/deb-src/"$1"/${uri}
-                apt-get -y --download-only --only-source source ${uri}
-            done' \
-                my_script "${BASE_DISTRO}-${BASE_DISTRO_CODENAME}" "${SRC_APT}"
-
-    schroot -e -c ${session_id}
-    schroot_delete_configs
-}
-
-addtask apt_fetch
-do_apt_fetch[lockfiles] += "${REPO_ISAR_DIR}/isar.lock"
-do_apt_fetch[network] = "${TASK_USE_NETWORK_AND_SUDO}"
-
-# Add dependency from the correct schroot: host or target
-do_apt_fetch[depends] += "${SCHROOT_DEP}"
-
-do_apt_unpack() {
-    rm -rf ${S}
-    schroot_create_configs
-
-    session_id=$(schroot -q -b -c ${SBUILD_CHROOT})
-    echo "Started session: ${session_id}"
-
-    schroot_cleanup() {
-        schroot -q -f -e -c ${session_id} > /dev/null 2>&1
-        schroot_delete_configs
-    }
-    trap 'exit 1' INT HUP QUIT TERM ALRM USR1
-    trap 'schroot_cleanup' EXIT
-
-    schroot -r -c ${session_id} -d / -u root -- \
-        rm /etc/apt/sources.list.d/isar-apt.list /etc/apt/preferences.d/isar-apt
-    schroot -r -c ${session_id} -d / -- \
-        sh -c '
-            set -e
-            for uri in $2; do
-                dscfile="$(apt-get -y -qq --print-uris --only-source source $uri | cut -d " " -f2 | grep -E "*.dsc")"
-                cd ${PP}
-                cp /downloads/deb-src/"${1}"/${uri}/* ${PP}
-                dpkg-source -x "${dscfile}" "${PPS}"
-            done' \
-                my_script "${BASE_DISTRO}-${BASE_DISTRO_CODENAME}" "${SRC_APT}"
-
-    schroot -e -c ${session_id}
-    schroot_delete_configs
-}
-do_apt_unpack[network] = "${TASK_USE_SUDO}"
-
-addtask apt_unpack after do_apt_fetch
-
-do_cleanall_apt[nostamp] = "1"
-do_cleanall_apt() {
-    for uri in "${SRC_APT}"; do
-        rm -rf "${DEBSRCDIR}/${BASE_DISTRO}-${BASE_DISTRO_CODENAME}/$uri"
-    done
-}
-
 def get_package_srcdir(d):
     s = os.path.abspath(d.getVar("S"))
     workdir = os.path.abspath(d.getVar("WORKDIR"))
diff --git a/meta/lib/aptsrc_fetcher.py b/meta/lib/aptsrc_fetcher.py
new file mode 100644
index 00000000..ee726202
--- /dev/null
+++ b/meta/lib/aptsrc_fetcher.py
@@ -0,0 +1,93 @@ 
+# This software is a part of ISAR.
+# Copyright (c) Siemens AG, 2024
+#
+# SPDX-License-Identifier: MIT
+
+from bb.fetch2 import FetchError
+from bb.fetch2 import FetchMethod
+from bb.fetch2 import logger
+from bb.fetch2 import runfetchcmd
+
+class AptSrc(FetchMethod):
+    def supports(self, ud, d):
+        return ud.type in ['apt']
+
+    def urldata_init(self, ud, d):
+        ud.src_package = ud.url[len('apt://'):]
+        ud.host = ud.host.replace('=', '_')
+
+        base_distro = d.getVar('BASE_DISTRO')
+
+        # For these distros we know that the same version means the same
+        # source package, also across distro releases.
+        distro_suffix = '' if base_distro in ['debian', 'ubuntu'] else \
+            '-' + d.getVar('BASE_DISTRO_CODENAME')
+
+        ud.localfile='deb-src/' + base_distro + distro_suffix + '/' + ud.host
+
+    def download(self, ud, d):
+        bb.utils.exec_flat_python_func('isar_export_proxies', d)
+        bb.build.exec_func('schroot_create_configs', d)
+
+        sbuild_chroot = d.getVar('SBUILD_CHROOT')
+        session_id = runfetchcmd(f'schroot -q -b -c {sbuild_chroot}', d).strip()
+        logger.info(f'Started session: {session_id}')
+
+        repo_isar_dir = d.getVar('REPO_ISAR_DIR')
+        lockfile = bb.utils.lockfile(f'{repo_isar_dir}/isar.lock')
+
+        try:
+            runfetchcmd(f'''
+                set -e
+                schroot -r -c {session_id} -d / -u root -- \
+                    rm /etc/apt/sources.list.d/isar-apt.list /etc/apt/preferences.d/isar-apt
+                schroot -r -c {session_id} -d / -- \
+                    sh -c '
+                        set -e
+                        mkdir -p /downloads/{ud.localfile}
+                        cd /downloads/{ud.localfile}
+                        apt-get -y --download-only --only-source source {ud.src_package}
+                        '
+                ''', d)
+        except (OSError, FetchError):
+            raise
+        finally:
+            bb.utils.unlockfile(lockfile)
+            runfetchcmd(f'schroot -q -f -e -c {session_id}', d)
+            bb.build.exec_func('schroot_delete_configs', d)
+
+    def unpack(self, ud, rootdir, d):
+        bb.build.exec_func('schroot_create_configs', d)
+
+        sbuild_chroot = d.getVar('SBUILD_CHROOT')
+        session_id = runfetchcmd(f'schroot -q -b -c {sbuild_chroot}', d).strip()
+        logger.info(f'Started session: {session_id}')
+
+        pp = d.getVar('PP')
+        pps = d.getVar('PPS')
+        try:
+            runfetchcmd(f'''
+                set -e
+                schroot -r -c {session_id} -d / -u root -- \
+                    rm /etc/apt/sources.list.d/isar-apt.list /etc/apt/preferences.d/isar-apt
+                schroot -r -c {session_id} -d / -- \
+                    sh -c '
+                        set -e
+                        dscfile=$(apt-get -y -qq --print-uris --only-source source {ud.src_package} | \
+                                  cut -d " " -f2 | grep -E "\.dsc")
+                        cp /downloads/{ud.localfile}/* {pp}
+                        cd {pp}
+                        mv -f {pps} {pps}.prev
+                        dpkg-source -x "$dscfile" {pps}
+                        find {pps}.prev -mindepth 1 -maxdepth 1 -exec mv {{}} {pps}/ \;
+                        rmdir {pps}.prev
+                        '
+                ''', d)
+        except (OSError, FetchError):
+            raise
+        finally:
+            runfetchcmd(f'schroot -q -f -e -c {session_id}', d)
+            bb.build.exec_func('schroot_delete_configs', d)
+
+    def clean(self, ud, d):
+        bb.utils.remove(ud.localpath, recurse=True)
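
For reference, recipes keep pointing at apt sources through SRC_URI;
only the fetcher behind the apt:// scheme changes. Illustrative snippet
(package name and version are placeholders, not taken from the patch):

    # latest source package offered by the configured suites
    SRC_URI = "apt://hello"

    # or pinned to a specific source version
    # SRC_URI = "apt://hello=2.10-3"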