File 0001-Avoid-infinite-loops-when-fetching-the-URL-from-Docu.patch of Package baloo5
From f57c4bc27b81f2b40d910d376a0bb531cd8472c6 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Stefan=20Br=C3=BCns?= <stefan.bruens@rwth-aachen.de>
Date: Thu, 19 Apr 2018 02:20:33 +0200
Subject: [PATCH] Avoid infinite loops when fetching the URL from DocumentUrlDB

Summary:
Some users apparently have DBs which contain infinite loops in the idTree
(a parentId pointing to itself or to one of its children). This manifests
as crashes due to exhausted memory and/or the process being stuck at 100%
CPU load.

The problem can only be fully solved by either recreating the DB or by
cleaning any problematic records out of it. As an interim solution, stop
the code from crashing.

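The bail-out logic of the patch can be sketched in isolation. This is a
minimal, self-contained illustration, not Baloo's actual code: `Entry`,
`buildPath`, and the in-memory map are hypothetical stand-ins for the
`idFilenameDb` records walked by `DocumentUrlDB::get()`.

```cpp
#include <cstdint>
#include <string>
#include <unordered_map>

struct Entry {
    std::string name;
    uint64_t parentId; // 0 means "no parent" (root reached)
};

// Walk the parentId chain, prepending each component. A corrupted DB may
// contain a cycle (an id reachable from its own parent chain), so cap the
// depth instead of looping forever; a missing/empty record also aborts.
std::string buildPath(const std::unordered_map<uint64_t, Entry>& db,
                      uint64_t docId)
{
    auto it = db.find(docId);
    if (it == db.end()) {
        return {};
    }
    std::string ret = it->second.name;
    uint64_t id = it->second.parentId;

    int depthLimit = 512; // arbitrary cap, mirroring the patch
    while (id) {
        auto p = db.find(id);
        if (p == db.end() || p->second.name.empty()) {
            return {}; // broken record: give up instead of asserting
        }
        if (!depthLimit--) {
            return {}; // suspiciously deep: almost certainly a cycle
        }
        ret = p->second.name + '/' + ret;
        id = p->second.parentId;
    }
    return ret;
}
```

A healthy chain (3 → 2 → 1 → root) yields the joined path, while a cyclic
mapping (1 → 2 → 1) exhausts the depth budget and returns an empty string
rather than spinning at 100% CPU.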
BUG: 378754
CCBUG: 385846
CCBUG: 391258
CCBUG: 393181

Reviewers: #baloo, michaelh, #frameworks, ngraham
Reviewed By: ngraham
Subscribers: ngraham, #frameworks
Tags: #frameworks, #baloo
Differential Revision: https://phabricator.kde.org/D12335
---
 src/engine/documenturldb.cpp | 12 ++++++++++--
 1 file changed, 10 insertions(+), 2 deletions(-)

diff --git a/src/engine/documenturldb.cpp b/src/engine/documenturldb.cpp
index c0470100..1aa51181 100644
--- a/src/engine/documenturldb.cpp
+++ b/src/engine/documenturldb.cpp
@@ -138,10 +138,18 @@ QByteArray DocumentUrlDB::get(quint64 docId) const
     QByteArray ret = path.name;
     quint64 id = path.parentId;
 
+    // arbitrary path depth limit - we have to deal with
+    // possibly corrupted DBs out in the wild
+    int depth_limit = 512;
+
     while (id) {
         auto p = idFilenameDb.get(id);
-        //FIXME: this prevents sanitzing
-        // reactivate Q_ASSERT(!p.name.isEmpty());
+        if (p.name.isEmpty()) {
+            return QByteArray();
+        }
+        if (!depth_limit--) {
+            return QByteArray();
+        }
         ret = p.name + '/' + ret;
         id = p.parentId;
     }
--
2.16.3