[Ma...] Posted Thursday at 02:41 PM
I have been working with Calypso for a long time, but I have never had to configure the outputs to be in Excel format. I'm sure this is easy, but can someone point me in the right direction? We are trying out new inspection software. I already have table files turned on because we are using QC Calc, which consumes the CHR file. So I am guessing I need to add another output, but I don't know which format to pick.
[Ja...] Posted Thursday at 04:41 PM
Hey Mathew, there are a few different options. This previous thread should have enough info to get you started:
[Da...] Posted Thursday at 05:06 PM
I am also trying to figure out how to send results to Excel. I have engineers wanting parts run to Excel so they can collect data.
[Ch...] Posted Thursday at 05:10 PM (edited)
I have an email with some details from Zeiss NC from about a year ago. Below is the information about the DataBridge Excel macro that easily moves sets of data from Calypso to Excel.
-------------------------------------------------
There are always questions from customers looking to efficiently export CALYPSO measurement data to Excel. One "easy" solution to this request, if all you need to do is get measured data into Excel, is to use a macro in Excel to read CALYPSO table files. CALYPSO can create table files without any add-on software. ZEISS Metrology Services offers a "CALYPSO Data Handling Webinar" that covers many different topics and provides the "DataBridge" Excel sheet with a macro you can use to easily import measurement data into Excel from CALYPSO. Attached are some examples of output directly possible from DataBridge. Note that if you intend to do higher-level statistical analysis on measurement data beyond a simple export to Excel, PiWeb is the correct tool.
The webinar session includes:
- Up to 25 attendees for your company
- One hour of instructor-led interactive training
- 8 pages of notes sent to all attendees
- The DataBridge Excel macro sent to all attendees
- Up to 1 hour of phone support as needed for data handling questions within one month of the webinar
Topics discussed in the webinar:
- Saving Measured Points
- Subsequent Evaluation
- Custom Naming of Point and Result Files
- Creating User-Defined Variables
- Displaying User-Defined Variables in Reports in PiWeb
- Creating Table Files
- Exporting of Multiple Results to Excel via DataBridge
If you are interested in the Data Handling Webinar, please let us know. The cost for the webinar and associated materials is $912, which includes up to a month of DataBridge-related support if needed. Your Metrology Services salesperson can provide a quotation for the webinar.
-------------------------------------------------
I have some 'example' Excel files, but I don't want to share them just in case... I was unable to find anything on this on the Portal or the Online Learning Academy.
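For anyone who wants the do-it-yourself version of that idea without the paid macro, the core of it is only a few lines of PowerShell. The snippet below is a sketch, not DataBridge: it assumes the characteristic table file is tab-delimited text with a header row (check yours and adjust the delimiter), and both file paths are placeholders for wherever Calypso writes your table files.

# Sketch only: convert a Calypso characteristic table file into a CSV that Excel opens cleanly.
# Assumptions: the table file is tab-delimited with a header row; both paths are placeholders.
$tableFile = "C:\Calypso\results\part_chr.txt"   # hypothetical path to the _chr.txt table file
$csvOut    = [System.IO.Path]::ChangeExtension($tableFile, ".csv")

# Re-parse the table file as tab-delimited and write out a standard CSV.
Import-Csv -Path $tableFile -Delimiter "`t" |
    Export-Csv -Path $csvOut -NoTypeInformation

# Optional: open the result in Excel (requires Excel on the machine).
Start-Process excel.exe $csvOut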
[DW...] Posted Thursday at 05:54 PM
$900 for a macro 🤣 Here is my contribution for free. If anyone comes across this post in the future, here is my personal contribution for automatically compiling many XLS reports created by Calypso into something that cuts reporting time down by 99% (not a typo). This is a PowerShell script: once you give it the path to the folder containing your XLS files, it manipulates them, with the final result being a file called "compiledExcel.xlsx" located in the same folder. Copy and paste the block of data into whatever official report you have, and that's it. Massive amounts of data compiled quickly and accurately. This script can be refined further, but the heavy lifting is here.

<#
.SYNOPSIS
    Converts .xls to .xlsx, trims specified ranges, and compiles data into a master Excel file.
#>
param(
    [Parameter(Mandatory = $false)]
    [string]$Directory = "C:\PATH TO YOUR FILES"
)

# --- Helper: Logging function ---
function Write-Log {
    param([string]$Message, [string]$Level = "INFO")
    $timestamp = (Get-Date).ToString("yyyy-MM-dd HH:mm:ss")
    Write-Host "[$timestamp][$Level] $Message"
}

# --- Verify directory ---
if (-not (Test-Path $Directory)) {
    Write-Error "Directory not found: $Directory"
    exit 1
}
Write-Log "Processing files in directory: $Directory"

# --- Start Excel safely ---
try {
    $excel = New-Object -ComObject Excel.Application
    $excel.Visible = $false
    $excel.DisplayAlerts = $false
    $excel.AutomationSecurity = 3  # msoAutomationSecurityForceDisable: open files with macros disabled
    Write-Log "Excel instance started successfully."
}
catch {
    Write-Error "Failed to start Excel COM object. Ensure Excel is installed. $_"
    exit 1
}

# --- Function: Safely close and release COM object ---
function Release-ComObject {
    param($obj)
    if ($null -ne $obj) {
        [void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($obj)
        Remove-Variable obj -ErrorAction SilentlyContinue
    }
}

# --- Convert .xls to .xlsx ---
Write-Log "Converting .xls files to .xlsx..."
Get-ChildItem -Path $Directory -Filter "*.xls" -ErrorAction SilentlyContinue | ForEach-Object {
    $xlsFile = $_.FullName
    $newFileName = [System.IO.Path]::ChangeExtension($xlsFile, ".xlsx")
    try {
        Write-Log "Converting $xlsFile..."
        $workbook = $excel.Workbooks.Open($xlsFile)
        $workbook.SaveAs($newFileName, 51)  # 51 = xlOpenXMLWorkbook (.xlsx)
        $workbook.Close($false)
        Write-Log "Converted: $xlsFile -> $newFileName"
        # Uncomment next line to delete original
        # Remove-Item $xlsFile -Force
    }
    catch {
        Write-Warning "Failed to convert ${xlsFile}: $_"
    }
    finally {
        if ($null -ne $workbook) { Release-ComObject -obj $workbook }
    }
}

# --- Process .xlsx files (remove A1:H13) ---
Write-Log "Trimming range A1:H13 from all .xlsx files..."
Get-ChildItem -Path $Directory -Filter "*.xlsx" -ErrorAction SilentlyContinue | ForEach-Object {
    $xlsxFile = $_.FullName
    try {
        Write-Log "Processing $xlsxFile..."
        $workbook = $excel.Workbooks.Open($xlsxFile)
        $worksheet = $workbook.Sheets.Item(1)
        $worksheet.Range("A1:H13").Delete()
        $workbook.Save()
        Write-Log "Trimmed file: $xlsxFile"
    }
    catch {
        Write-Warning "Error processing file ${xlsxFile}: $_"
    }
    finally {
        if ($null -ne $workbook) {
            $workbook.Close($false)
            Release-ComObject -obj $workbook
        }
    }
}

# --- Compile into one Excel file ---
Write-Log "Compiling data into master file..."
$destinationExcelFilePath = Join-Path $Directory "compiledExcel.xlsx"
if (Test-Path $destinationExcelFilePath) {
    Remove-Item $destinationExcelFilePath -Force -ErrorAction SilentlyContinue
    Write-Log "Removed existing compiledExcel.xlsx"
}
try {
    $destinationWorkbook = $excel.Workbooks.Add()
    $destinationWorksheet = $destinationWorkbook.Sheets.Item(1)
    $columnIndex = 1
    Get-ChildItem -Path $Directory -Filter "*.xlsx" | Sort-Object Name | ForEach-Object {
        $sourcePath = $_.FullName
        try {
            Write-Log "Copying data from $sourcePath"
            $sourceWorkbook = $excel.Workbooks.Open($sourcePath)
            $sourceWorksheet = $sourceWorkbook.Sheets.Item(1)
            $sourceValues = $sourceWorksheet.Range("B1:B1000").Value2
            if ($sourceValues) {
                $destinationWorksheet.Cells.Item(1, $columnIndex).Resize(1000, 1).Value2 = $sourceValues
                Write-Log "Copied column $columnIndex from $sourcePath"
                $columnIndex++
            }
            else {
                Write-Warning "No data found in B1:B1000 for $sourcePath"
            }
        }
        catch {
            Write-Warning "Error reading from ${sourcePath}: $_"
        }
        finally {
            if ($null -ne $sourceWorkbook) {
                $sourceWorkbook.Close($false)
                Release-ComObject -obj $sourceWorkbook
            }
        }
    }
    $destinationWorkbook.SaveAs($destinationExcelFilePath)
    Write-Log "Compiled workbook saved to $destinationExcelFilePath"
}
catch {
    Write-Error "Error compiling Excel files: $_"
}
finally {
    if ($null -ne $destinationWorkbook) {
        $destinationWorkbook.Close($true)
        Release-ComObject -obj $destinationWorkbook
    }
}

# --- Cleanup Excel ---
Write-Log "Cleaning up Excel COM objects..."
try {
    $excel.Quit()
    Release-ComObject -obj $excel
    [System.GC]::Collect()
    [System.GC]::WaitForPendingFinalizers()
    Write-Log "Excel closed and resources released."
}
catch {
    Write-Warning "Error during Excel cleanup: $_"
}
Write-Log "Script completed successfully."
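If it helps anyone, this is roughly how the script above would be invoked. The script filename and the folder path are placeholders (save the block above under whatever name you like):

# Usage sketch -- script name and folder are placeholders, substitute your own.
# Save the block above as CompileCalypsoReports.ps1, then from a command prompt run:
powershell.exe -ExecutionPolicy Bypass -File .\CompileCalypsoReports.ps1 -Directory "C:\CalypsoReports\Job1234"

# Or, from an already-open PowerShell window:
.\CompileCalypsoReports.ps1 -Directory "C:\CalypsoReports\Job1234"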
[Ch...] Posted Thursday at 06:10 PM
I agree, and this is probably why it is no longer found or offered: AI has essentially made it obsolete.
[Ch...] Posted Thursday at 06:45 PM
As everyone is alluding to here, I think the best way is to simply reuse the characteristic table file, _chr.txt, and then manipulate it. I highly recommend letting Calypso send it to its default folder, then using some sort of script to manipulate it and copy it to whatever folder you want. For example, we use 'report_end.bat' files to move and copy everything that comes off the CMM. We take our table files and send them to multiple different places depending on the wants and needs of various departments (a sketch of that idea follows below). Remember, with AI, everyone is a programmer now.
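A minimal sketch of that report_end.bat idea, written here in PowerShell rather than plain batch. Everything in it is an assumption: the source folder stands in for wherever Calypso drops its table files on your system, the destination shares are made up, and a report_end.bat would simply call the saved script, for example with powershell.exe -ExecutionPolicy Bypass -File Distribute-TableFiles.ps1 (a hypothetical name).

# Sketch of "copy the table file wherever each department wants it".
# All paths are placeholders -- point $source at Calypso's actual results folder
# and list your own destination shares.
$source       = "C:\Users\Public\Documents\Zeiss\CALYPSO\workarea\results"  # assumed results folder
$destinations = @(
    "\\server\Quality\CMM\TableFiles",      # hypothetical share for QC
    "\\server\Engineering\CMM\TableFiles"   # hypothetical share for engineering
)

# Grab the characteristic table files and fan them out to every destination.
Get-ChildItem -Path $source -Filter "*_chr.txt" | ForEach-Object {
    foreach ($dest in $destinations) {
        Copy-Item -Path $_.FullName -Destination $dest -Force
    }
}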