private void removeFilesInIndex(List<Object> filesToRemove, List<IIndexFragmentFile> indexFilesToRemove,
        IProgressMonitor monitor) throws InterruptedException, CoreException {
    if (!filesToRemove.isEmpty() || !indexFilesToRemove.isEmpty()) {
        fIndex.acquireWriteLock();
        try {
            // Clear every writable index file that belongs to a translation unit scheduled for removal.
            for (Object tu : filesToRemove) {
                if (monitor.isCanceled()) {
                    return;
                }
                IIndexFileLocation ifl = fResolver.resolveFile(tu);
                if (ifl == null)
                    continue;
                IIndexFragmentFile[] ifiles = fIndex.getWritableFiles(ifl);
                for (IIndexFragmentFile ifile : ifiles) {
                    fIndex.clearFile(ifile);
                }
                incrementRequestedFilesCount(-1);
            }
            // Clear the index files that were explicitly requested for removal.
            for (IIndexFragmentFile ifile : indexFilesToRemove) {
                if (monitor.isCanceled()) {
                    return;
                }
                fIndex.clearFile(ifile);
                incrementRequestedFilesCount(-1);
            }
        } finally {
            fIndex.releaseWriteLock();
        }
    }
    filesToRemove.clear();
}
private void setResume(boolean value) throws InterruptedException, CoreException {
    fIndex.acquireWriteLock();
    try {
        fIndex.getWritableFragment().setProperty(IIndexFragment.PROPERTY_RESUME_INDEXER, String.valueOf(value));
    } finally {
        fIndex.releaseWriteLock();
    }
}
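// A minimal sketch, not part of the original class, of how the resume flag written by
// setResume() could be read back, e.g. to detect that a previous indexing run was
// interrupted. The method name isResumeRequested() is hypothetical, and it assumes
// that IIndexFragment.getProperty(String) is available on the writable fragment.
private boolean isResumeRequested() throws InterruptedException, CoreException {
    fIndex.acquireReadLock();
    try {
        String value = fIndex.getWritableFragment().getProperty(IIndexFragment.PROPERTY_RESUME_INDEXER);
        // Boolean.parseBoolean(null) yields false, so a fresh index reads as "no resume needed".
        return Boolean.parseBoolean(value);
    } finally {
        fIndex.releaseReadLock();
    }
}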
public static void buildIndex(List<String> files) throws Exception {
    File writableIndex = new File("/tmp/standaloneIndexFile.pdom");
    IIndexLocationConverter converter = new MyIndexLocationConverter();

    // See PDOMCPPLinkageFactory if needed.
    Map<String, IPDOMLinkageFactory> linkageFactoryMappings = new HashMap<String, IPDOMLinkageFactory>();
    linkageFactoryMappings.put(ILinkage.C_LINKAGE_NAME, new PDOMCLinkageFactory());
    linkageFactoryMappings.put(ILinkage.CPP_LINKAGE_NAME, new PDOMCPPLinkageFactory());

    IStandaloneScannerInfoProvider scannerProvider = new MyStandaloneScannerInfoProvider();
    ILanguageMapper mapper = new MyLanguageMapper();
    IParserLogService log = new MyParserLogService();
    IProgressMonitor monitor = new MyProgressMonitor();

    StandaloneIndexer indexer = new StandaloneFastIndexer(writableIndex, converter, linkageFactoryMappings,
            scannerProvider, mapper, log);
    indexer.setTraceStatistics(true);
    indexer.rebuild(files, monitor);

    IWritableIndex i = indexer.getIndex();
    GlobalResultCollector.bindingindex = i;

    // Dump the contents of the freshly built index while holding a read lock.
    i.acquireReadLock();
    try {
        System.out.println("internals");
        for (IIndexFile file : i.getAllFiles()) {
            System.out.printf("Index has file: %s\n", file.toString());
        }
        char[] prefix = {};
        for (IIndexBinding binding : i.findBindingsForPrefix(prefix, false, IndexFilter.ALL, monitor)) {
            System.out.println(binding);
        }
        System.out.println(i);
    } finally {
        i.releaseReadLock();
    }
}
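// A minimal usage sketch for buildIndex(). The source paths below are placeholders;
// substitute absolute paths to the C/C++ translation units you want indexed.
// Assumes java.util.Arrays and java.util.List are imported.
public static void main(String[] args) throws Exception {
    List<String> files = Arrays.asList(
            "/home/user/project/src/main.cpp", // hypothetical C++ source
            "/home/user/project/src/util.c");  // hypothetical C source
    buildIndex(files);
}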
public IIndexFragmentFile[] getAvailableIndexFiles(int linkageID, IIndexFileLocation ifl) throws CoreException {
    IIndexFragmentFile[] files = fIndexFilesCache.get(ifl);
    if (files == null) {
        IIndexFragmentFile[] fragFiles = fIndex.getWritableFiles(linkageID, ifl);
        // Compact the array in place, keeping only fragment files that actually have content.
        int j = 0;
        for (int i = 0; i < fragFiles.length; i++) {
            if (fragFiles[i].hasContent()) {
                if (j != i)
                    fragFiles[j] = fragFiles[i];
                j++;
            }
        }
        if (j == fragFiles.length) {
            files = fragFiles;
        } else {
            files = new IIndexFragmentFile[j];
            System.arraycopy(fragFiles, 0, files, 0, j);
        }
        fIndexFilesCache.put(ifl, files);
    }
    return files;
}
private void parseLinkage(int linkageID, List<IIndexFileLocation> files, IProgressMonitor monitor)
        throws CoreException, InterruptedException {
    LinkageTask map = findRequestMap(linkageID);
    if (map == null || files == null || files.isEmpty())
        return;

    // First parse the required sources
    for (Iterator<IIndexFileLocation> it = files.iterator(); it.hasNext(); ) {
        IIndexFileLocation ifl = it.next();
        LocationTask locTask = map.find(ifl);
        if (locTask == null || locTask.isCompleted()) {
            it.remove();
        } else if (locTask.fKind == UpdateKind.REQUIRED_SOURCE) {
            if (monitor.isCanceled() || hasUrgentTasks())
                return;
            final Object tu = locTask.fTu;
            final IScannerInfo scannerInfo = getScannerInfo(linkageID, tu);
            parseFile(tu, getLanguage(tu, linkageID), ifl, scannerInfo, null, monitor);
        }
    }

    // Files with context
    for (Iterator<IIndexFileLocation> it = files.iterator(); it.hasNext(); ) {
        IIndexFileLocation ifl = it.next();
        LocationTask locTask = map.find(ifl);
        if (locTask == null || locTask.isCompleted()) {
            it.remove();
        } else {
            for (FileVersionTask versionTask : locTask.fVersionTasks) {
                if (versionTask.fOutdated) {
                    if (monitor.isCanceled() || hasUrgentTasks())
                        return;
                    parseVersionInContext(linkageID, map, ifl, versionTask, locTask.fTu,
                            new LinkedHashSet<IIndexFile>(), monitor);
                }
            }
        }
    }

    // Files without context
    for (Iterator<IIndexFileLocation> it = files.iterator(); it.hasNext(); ) {
        IIndexFileLocation ifl = it.next();
        LocationTask locTask = map.find(ifl);
        if (locTask == null || locTask.isCompleted()) {
            it.remove();
        } else {
            if (locTask.needsVersion()) {
                if (monitor.isCanceled() || hasUrgentTasks())
                    return;
                final Object tu = locTask.fTu;
                final IScannerInfo scannerInfo = getScannerInfo(linkageID, tu);
                parseFile(tu, getLanguage(tu, linkageID), ifl, scannerInfo, null, monitor);
                if (locTask.isCompleted())
                    it.remove();
            }
        }
    }

    // Delete remaining files.
    fIndex.acquireWriteLock();
    try {
        for (IIndexFileLocation ifl : files) {
            LocationTask locTask = map.find(ifl);
            if (locTask != null && !locTask.isCompleted()) {
                if (!locTask.needsVersion()) {
                    if (monitor.isCanceled() || hasUrgentTasks())
                        return;
                    Iterator<FileVersionTask> it = locTask.fVersionTasks.iterator();
                    while (it.hasNext()) {
                        FileVersionTask v = it.next();
                        if (v.fOutdated) {
                            fIndex.clearFile(v.fIndexFile);
                            reportFile(true, locTask.fKind);
                            locTask.removeVersionTask(it);
                            fIndexContentCache.remove(v.fIndexFile);
                            fIndexFilesCache.remove(ifl);
                        }
                    }
                }
            }
        }
    } finally {
        fIndex.releaseWriteLock();
    }
}
private void extractFiles(HashMap<Integer, List<IIndexFileLocation>> files, List<IIndexFragmentFile> iFilesToRemove,
        IProgressMonitor monitor) throws CoreException {
    final boolean forceAll = (fUpdateFlags & IIndexManager.UPDATE_ALL) != 0;
    final boolean checkTimestamps = (fUpdateFlags & IIndexManager.UPDATE_CHECK_TIMESTAMPS) != 0;
    final boolean checkFileContentsHash = (fUpdateFlags & IIndexManager.UPDATE_CHECK_CONTENTS_HASH) != 0;
    final boolean forceUnresolvedIncludes = (fUpdateFlags & IIndexManager.UPDATE_UNRESOLVED_INCLUDES) != 0;
    final boolean both = fIndexHeadersWithoutContext == UnusedHeaderStrategy.useBoth;
    int count = 0;
    int forceFirst = fForceNumberFiles;
    BitSet linkages = new BitSet();
    for (final Object tu : fFilesToUpdate) {
        if (monitor.isCanceled())
            return;

        final boolean force = forceAll || --forceFirst >= 0;
        final IIndexFileLocation ifl = fResolver.resolveFile(tu);
        if (ifl == null)
            continue;

        final IIndexFragmentFile[] indexFiles = fIndex.getWritableFiles(ifl);
        final boolean isSourceUnit = fResolver.isSourceUnit(tu);
        linkages.clear();
        final boolean regularContent = isRequiredInIndex(tu, ifl, isSourceUnit);
        final boolean indexedUnconditionally = fResolver.isIndexedUnconditionally(ifl);
        if (regularContent || indexedUnconditionally) {
            // Headers or sources required with a specific linkage
            final UpdateKind updateKind = isSourceUnit ? UpdateKind.REQUIRED_SOURCE
                    : regularContent && both ? UpdateKind.REQUIRED_HEADER : UpdateKind.ONE_LINKAGE_HEADER;
            if (regularContent || indexFiles.length == 0) {
                AbstractLanguage[] langs = fResolver.getLanguages(tu, fIndexHeadersWithoutContext);
                for (AbstractLanguage lang : langs) {
                    int linkageID = lang.getLinkageID();
                    boolean foundInLinkage = false;
                    for (int i = 0; i < indexFiles.length; i++) {
                        IIndexFragmentFile ifile = indexFiles[i];
                        if (ifile != null && ifile.getLinkageID() == linkageID && ifile.hasContent()) {
                            foundInLinkage = true;
                            indexFiles[i] = null; // Take the file.
                            boolean update = force || (forceUnresolvedIncludes && ifile.hasUnresolvedInclude())
                                    || isModified(checkTimestamps, checkFileContentsHash, ifl, tu, ifile);
                            if (update && requestUpdate(linkageID, ifl, ifile, tu, updateKind)) {
                                count++;
                                linkages.set(linkageID);
                            }
                        }
                    }
                    if (!foundInLinkage && requestUpdate(linkageID, ifl, null, tu, updateKind)) {
                        linkages.set(linkageID);
                        count++;
                    }
                }
            }
        }

        // Handle other files present in index.
        for (IIndexFragmentFile ifile : indexFiles) {
            if (ifile != null) {
                IIndexInclude ctx = ifile.getParsedInContext();
                if (ctx == null && !indexedUnconditionally && ifile.hasContent()) {
                    iFilesToRemove.add(ifile);
                    count++;
                } else {
                    boolean update = force || (forceUnresolvedIncludes && ifile.hasUnresolvedInclude())
                            || isModified(checkTimestamps, checkFileContentsHash, ifl, tu, ifile);
                    final int linkageID = ifile.getLinkageID();
                    if (update && requestUpdate(linkageID, ifl, ifile, tu, UpdateKind.OTHER_HEADER)) {
                        count++;
                        linkages.set(linkageID);
                    }
                }
            }
        }
        for (int lid = linkages.nextSetBit(0); lid >= 0; lid = linkages.nextSetBit(lid + 1)) {
            addPerLinkage(lid, ifl, files);
        }
    }
    synchronized (this) {
        incrementRequestedFilesCount(count - fFilesToUpdate.length);
        fFilesToUpdate = null;
    }
}
public final void runTask(IProgressMonitor monitor) throws InterruptedException {
    try {
        if (!fIndexFilesWithoutConfiguration) {
            fIndexHeadersWithoutContext = UnusedHeaderStrategy.skip;
        }

        fIndex = createIndex();
        if (fIndex == null) {
            return;
        }
        fTodoTaskUpdater = createTodoTaskUpdater();

        fASTOptions = ILanguage.OPTION_NO_IMAGE_LOCATIONS
                | ILanguage.OPTION_SKIP_TRIVIAL_EXPRESSIONS_IN_AGGREGATE_INITIALIZERS;
        if (getSkipReferences() == SKIP_ALL_REFERENCES) {
            fASTOptions |= ILanguage.OPTION_SKIP_FUNCTION_BODIES;
        }

        fIndex.resetCacheCounters();
        fIndex.acquireReadLock();

        try {
            try {
                // Split into sources and headers, remove excluded sources.
                HashMap<Integer, List<IIndexFileLocation>> files = new HashMap<Integer, List<IIndexFileLocation>>();
                final ArrayList<IIndexFragmentFile> indexFilesToRemove = new ArrayList<IIndexFragmentFile>();
                extractFiles(files, indexFilesToRemove, monitor);

                setResume(true);

                // Remove files from index
                removeFilesInIndex(fFilesToRemove, indexFilesToRemove, monitor);

                HashMap<Integer, List<IIndexFileLocation>> moreFiles = null;
                while (true) {
                    for (int linkageID : getLinkagesToParse()) {
                        final List<IIndexFileLocation> filesForLinkage = files.get(linkageID);
                        if (filesForLinkage != null) {
                            parseLinkage(linkageID, filesForLinkage, monitor);
                            for (Iterator<LocationTask> it = fOneLinkageTasks.values().iterator(); it.hasNext(); ) {
                                LocationTask task = it.next();
                                if (task.isCompleted())
                                    it.remove();
                            }
                            fIndexContentCache.clear();
                            fIndexFilesCache.clear();
                        }
                        if (hasUrgentTasks())
                            break;
                    }
                    synchronized (this) {
                        if (fUrgentTasks.isEmpty()) {
                            if (moreFiles == null) {
                                // No urgent tasks and no more files to parse. We are done.
                                fTaskCompleted = true;
                                break;
                            } else {
                                files = moreFiles;
                                moreFiles = null;
                            }
                        }
                    }
                    AbstractIndexerTask urgentTask;
                    while ((urgentTask = getUrgentTask()) != null) {
                        // Move the lists of not yet parsed files from 'files' to 'moreFiles'.
                        if (moreFiles == null) {
                            moreFiles = files;
                        } else {
                            for (Map.Entry<Integer, List<IIndexFileLocation>> entry : files.entrySet()) {
                                List<IIndexFileLocation> list = moreFiles.get(entry.getKey());
                                if (list == null) {
                                    moreFiles.put(entry.getKey(), entry.getValue());
                                } else {
                                    list.addAll(0, entry.getValue());
                                }
                            }
                        }
                        // Extract files from the urgent task.
                        files = new HashMap<Integer, List<IIndexFileLocation>>();
                        fFilesToUpdate = urgentTask.fFilesToUpdate;
                        fForceNumberFiles = urgentTask.fForceNumberFiles;
                        fFilesToRemove = urgentTask.fFilesToRemove;
                        incrementRequestedFilesCount(fFilesToUpdate.length + fFilesToRemove.size());
                        extractFiles(files, indexFilesToRemove, monitor);
                        removeFilesInIndex(fFilesToRemove, indexFilesToRemove, monitor);
                    }
                }
                if (!monitor.isCanceled()) {
                    setResume(false);
                }
            } finally {
                fIndex.flush();
            }
        } catch (CoreException e) {
            logException(e);
        } finally {
            fIndex.releaseReadLock();
        }
    } finally {
        synchronized (this) {
            fTaskCompleted = true;
        }
    }
}